
Scikit-Learn

Learn Python for data science interactively at www.DataCamp.com

Scikit-learn

Scikit-learn is an open-source Python library that implements a range of machine learning, preprocessing, cross-validation, and visualization algorithms using a unified interface.

A Basic Example

>>> from sklearn import neighbors, datasets, preprocessing
>>> from sklearn.model_selection import train_test_split
>>> from sklearn.metrics import accuracy_score
>>> iris = datasets.load_iris()
>>> X, y = iris.data[:, :2], iris.target
>>> X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=33)
>>> scaler = preprocessing.StandardScaler().fit(X_train)
>>> X_train = scaler.transform(X_train)
>>> X_test = scaler.transform(X_test)
>>> knn = neighbors.KNeighborsClassifier(n_neighbors=5)
>>> knn.fit(X_train, y_train)
>>> y_pred = knn.predict(X_test)
>>> accuracy_score(y_test, y_pred)

Loading The Data (also see NumPy & Pandas)

Your data needs to be numeric and stored as NumPy arrays or SciPy sparse matrices. Other types that are convertible to numeric arrays, such as Pandas DataFrames, are also acceptable.

>>> import numpy as np
>>> X = np.random.random((10, 5))
>>> y = np.array(['M','M','F','F','M','F','M','M','F','F'])
>>> X[X < 0.7] = 0

Training And Test Data

>>> from sklearn.model_selection import train_test_split
>>> X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

Preprocessing The Data

Standardization

>>> from sklearn.preprocessing import StandardScaler
>>> scaler = StandardScaler().fit(X_train)
>>> standardized_X = scaler.transform(X_train)
>>> standardized_X_test = scaler.transform(X_test)

Normalization

>>> from sklearn.preprocessing import Normalizer
>>> scaler = Normalizer().fit(X_train)
>>> normalized_X = scaler.transform(X_train)
>>> normalized_X_test = scaler.transform(X_test)

Binarization

>>> from sklearn.preprocessing import Binarizer
>>> binarizer = Binarizer(threshold=0.0).fit(X)
>>> binary_X = binarizer.transform(X)

Encoding Categorical Features

>>> from sklearn.preprocessing import LabelEncoder
>>> enc = LabelEncoder()
>>> y = enc.fit_transform(y)

Imputing Missing Values

>>> from sklearn.impute import SimpleImputer
>>> imp = SimpleImputer(missing_values=0, strategy='mean')
>>> imp.fit_transform(X_train)

Generating Polynomial Features

>>> from sklearn.preprocessing import PolynomialFeatures
>>> poly = PolynomialFeatures(5)
>>> poly.fit_transform(X)

Create Your Model

Supervised Learning Estimators

Linear Regression
>>> from sklearn.linear_model import LinearRegression
>>> lr = LinearRegression()

Support Vector Machines (SVM)
>>> from sklearn.svm import SVC
>>> svc = SVC(kernel='linear')

Naive Bayes
>>> from sklearn.naive_bayes import GaussianNB
>>> gnb = GaussianNB()

KNN
>>> from sklearn import neighbors
>>> knn = neighbors.KNeighborsClassifier(n_neighbors=5)

Unsupervised Learning Estimators

Principal Component Analysis (PCA)
>>> from sklearn.decomposition import PCA
>>> pca = PCA(n_components=0.95)

K Means
>>> from sklearn.cluster import KMeans
>>> k_means = KMeans(n_clusters=3, random_state=0)

Model Fitting

Supervised learning
>>> lr.fit(X, y)                  # Fit the model to the data
>>> knn.fit(X_train, y_train)
>>> svc.fit(X_train, y_train)

Unsupervised learning
>>> k_means.fit(X_train)                     # Fit the model to the data
>>> pca_model = pca.fit_transform(X_train)   # Fit to data, then transform it

Prediction

Supervised Estimators
>>> y_pred = svc.predict(np.random.random((2, 5)))   # Predict labels
>>> y_pred = lr.predict(X_test)                      # Predict labels
>>> y_pred = knn.predict_proba(X_test)               # Estimate probability of a label

Unsupervised Estimators
>>> y_pred = k_means.predict(X_test)   # Predict labels in clustering algorithms

Evaluate Your Model's Performance

Classification Metrics

Accuracy Score
>>> knn.score(X_test, y_test)                    # Estimator score method
>>> from sklearn.metrics import accuracy_score   # Metric scoring function
>>> accuracy_score(y_test, y_pred)

Classification Report
>>> from sklearn.metrics import classification_report   # Precision, recall, f1-score and support
>>> print(classification_report(y_test, y_pred))

Confusion Matrix
>>> from sklearn.metrics import confusion_matrix
>>> print(confusion_matrix(y_test, y_pred))

Regression Metrics

Mean Absolute Error
>>> from sklearn.metrics import mean_absolute_error
>>> y_true = [3, -0.5, 2]
>>> mean_absolute_error(y_true, y_pred)

Mean Squared Error
>>> from sklearn.metrics import mean_squared_error
>>> mean_squared_error(y_test, y_pred)

R² Score
>>> from sklearn.metrics import r2_score
>>> r2_score(y_true, y_pred)

Clustering Metrics

Adjusted Rand Index
>>> from sklearn.metrics import adjusted_rand_score
>>> adjusted_rand_score(y_true, y_pred)

Homogeneity
>>> from sklearn.metrics import homogeneity_score
>>> homogeneity_score(y_true, y_pred)

V-measure
>>> from sklearn.metrics import v_measure_score
>>> v_measure_score(y_true, y_pred)

Cross-Validation

>>> from sklearn.model_selection import cross_val_score
>>> print(cross_val_score(knn, X_train, y_train, cv=4))
>>> print(cross_val_score(lr, X, y, cv=2))

Tune Your Model

Grid Search
>>> from sklearn.model_selection import GridSearchCV
>>> params = {"n_neighbors": np.arange(1, 3),
...           "metric": ["euclidean", "cityblock"]}
>>> grid = GridSearchCV(estimator=knn, param_grid=params)
>>> grid.fit(X_train, y_train)
>>> print(grid.best_score_)
>>> print(grid.best_estimator_.n_neighbors)

Randomized Parameter Optimization
>>> from sklearn.model_selection import RandomizedSearchCV
>>> params = {"n_neighbors": range(1, 5),
...           "weights": ["uniform", "distance"]}
>>> rsearch = RandomizedSearchCV(estimator=knn,
...                              param_distributions=params,
...                              cv=4, n_iter=8, random_state=5)
>>> rsearch.fit(X_train, y_train)
>>> print(rsearch.best_score_)
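The snippets above can be combined into one runnable script. This is a minimal sketch, assuming scikit-learn ≥ 0.20 (where `cross_validation` and `grid_search` were folded into `sklearn.model_selection`); the grid of `n_neighbors` values is chosen for illustration only.

```python
import numpy as np
from sklearn import datasets
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import accuracy_score

# Load the iris data, keeping only the first two features as in the sheet
iris = datasets.load_iris()
X, y = iris.data[:, :2], iris.target

# Hold out a test set
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=33)

# Standardize using statistics computed on the training set only,
# so no information leaks from the test set
scaler = StandardScaler().fit(X_train)
X_train = scaler.transform(X_train)
X_test = scaler.transform(X_test)

# Fit a 5-nearest-neighbours classifier and evaluate it
knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)
y_pred = knn.predict(X_test)
acc = accuracy_score(y_test, y_pred)
print(f"accuracy: {acc:.3f}")

# Tune n_neighbors with a small grid search (the range 1..9 is illustrative)
grid = GridSearchCV(KNeighborsClassifier(),
                    {"n_neighbors": np.arange(1, 10)}, cv=4)
grid.fit(X_train, y_train)
print("best n_neighbors:", grid.best_estimator_.n_neighbors)
```

Fitting the scaler before the split, or on the full data, is a common mistake; the order above (split, then fit the scaler on the training portion) is what the cheat sheet's basic example implies.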

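The regression metrics in the sheet are easy to verify by hand on a tiny example. The sheet defines `y_true = [3, -0.5, 2]`; the `y_pred` below is chosen for illustration, and the expected values follow from the metric definitions (MAE is the mean absolute residual, MSE the mean squared residual, and R² is one minus the ratio of residual to total sum of squares).

```python
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

y_true = [3, -0.5, 2]
y_pred = [2.5, 0.0, 2]   # illustrative predictions; residuals are 0.5, -0.5, 0

mae = mean_absolute_error(y_true, y_pred)  # (0.5 + 0.5 + 0) / 3  = 1/3
mse = mean_squared_error(y_true, y_pred)   # (0.25 + 0.25 + 0) / 3 = 1/6
r2 = r2_score(y_true, y_pred)              # 1 - 0.5 / 6.5 = 12/13

print(mae, mse, r2)
```

The clustering metrics (`adjusted_rand_score`, `homogeneity_score`, `v_measure_score`) take label sequences in the same `(y_true, y_pred)` shape but compare cluster assignments rather than values, so they are invariant to permuting the cluster labels.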