Pro machine learning algorithms : a hands-on approach to implementing algorithms in Python and R, V. Kishore Ayyadevara
Resource Information
The item Pro machine learning algorithms : a hands-on approach to implementing algorithms in Python and R, V. Kishore Ayyadevara represents a specific, individual, material embodiment of a distinct intellectual or artistic creation found in Sydney Jones Library, University of Liverpool.
This item is available to borrow from 1 library branch.
- Summary
- Bridge the gap between a high-level understanding of how an algorithm works and knowing the nuts and bolts needed to tune your models better. This book gives you the confidence and skills to develop all the major machine learning models. In Pro Machine Learning Algorithms, you first develop each algorithm in Excel so that you gain a practical understanding of all the levers that can be tuned in a model, before implementing the models in Python/R. The book covers all the major supervised and unsupervised learning algorithms, including linear/logistic regression, k-means clustering, PCA, recommender systems, decision trees, random forests, GBM, and neural networks. You are also exposed to the latest in deep learning through CNNs, RNNs, and word2vec for text mining. You learn not only the algorithms but also the feature engineering concepts needed to maximize a model's performance. Theory is paired with case studies, such as sentiment classification, fraud detection, recommender systems, and image recognition, so that you get the best of both theory and practice for the vast majority of the machine learning algorithms used in industry. Along the way, you are also shown how to run machine learning models on the major cloud service providers. Only minimal knowledge of statistics and software programming is assumed, and by the end of the book you should be able to work on a machine learning project with confidence. You will: get an in-depth understanding of all the major machine learning and deep learning algorithms; appreciate the pitfalls to avoid while building models; implement machine learning algorithms in the cloud; follow a hands-on approach through case studies for each algorithm; learn the tricks of ensemble learning to build more accurate models; and discover the basics of programming in R/Python and the Keras framework for deep learning.
- Language
- eng
- Extent
- 1 online resource
- Contents
-
- Intro; Table of Contents; About the Author; About the Technical Reviewer; Acknowledgments; Introduction; Chapter 1: Basics of Machine Learning; Regression and Classification; Training and Testing Data; The Need for Validation Dataset; Measures of Accuracy; Absolute Error; Root Mean Square Error; Confusion Matrix; AUC Value and ROC Curve; Unsupervised Learning; Typical Approach Towards Building a Model; Where Is the Data Fetched From?; Which Data Needs to Be Fetched?; Pre-processing the Data; Feature Interaction; Feature Generation; Building the Models; Productionalizing the Models
- Build, Deploy, Test, and Iterate; Summary; Chapter 2: Linear Regression; Introducing Linear Regression; Variables: Dependent and Independent; Correlation; Causation; Simple vs. Multivariate Linear Regression; Formalizing Simple Linear Regression; The Bias Term; The Slope; Solving a Simple Linear Regression; More General Way of Solving a Simple Linear Regression; Minimizing the Overall Sum of Squared Error; Solving the Formula; Working Details of Simple Linear Regression; Complicating Simple Linear Regression a Little; Arriving at Optimal Coefficient Values; Introducing Root Mean Squared Error
- Running a Simple Linear Regression in R; Residuals; Coefficients; SSE of Residuals (Residual Deviance); Null Deviance; R Squared; F-statistic; Running a Simple Linear Regression in Python; Common Pitfalls of Simple Linear Regression; Multivariate Linear Regression; Working Details of Multivariate Linear Regression; Multivariate Linear Regression in R; Multivariate Linear Regression in Python; Issue of Having a Non-significant Variable in the Model; Issue of Multicollinearity; Mathematical Intuition of Multicollinearity; Further Points to Consider in Multivariate Linear Regression
- Assumptions of Linear Regression; Summary; Chapter 3: Logistic Regression; Why Does Linear Regression Fail for Discrete Outcomes?; A More General Solution: Sigmoid Curve; Formalizing the Sigmoid Curve (Sigmoid Activation); From Sigmoid Curve to Logistic Regression; Interpreting the Logistic Regression; Working Details of Logistic Regression; Estimating Error; Scenario 1; Scenario 2; Least Squares Method and Assumption of Linearity; Running a Logistic Regression in R; Running a Logistic Regression in Python; Identifying the Measure of Interest; Common Pitfalls
- Time Between Prediction and the Event Happening; Outliers in Independent Variables; Summary; Chapter 4: Decision Tree; Components of a Decision Tree; Classification Decision Tree When There Are Multiple Discrete Independent Variables; Information Gain; Calculating Uncertainty: Entropy; Calculating Information Gain; Uncertainty in the Original Dataset; Measuring the Improvement in Uncertainty; Which Distinct Values Go to the Left and Right Nodes; Gini Impurity; Splitting Sub-nodes Further; When Does the Splitting Process Stop?; Classification Decision Tree for Continuous Independent Variables
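The chapter listing above names several concrete computations. As a quick illustration of the Chapter 2 material (minimizing the overall sum of squared errors and root mean squared error), here is a minimal Python sketch. It is not taken from the book; the helper names fit_simple_ols and rmse and the toy data are illustrative assumptions.

```python
# Minimal sketch (not from the book): closed-form simple linear regression and RMSE.
import numpy as np

def fit_simple_ols(x, y):
    """Return (intercept, slope) minimizing the sum of squared errors."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    slope = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    intercept = y.mean() - slope * x.mean()
    return intercept, slope

def rmse(y_true, y_pred):
    """Root mean squared error of the fitted values."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

# Toy usage with made-up data
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 8.1, 9.8]
b0, b1 = fit_simple_ols(x, y)
print(round(b0, 3), round(b1, 3), round(rmse(y, b0 + b1 * np.asarray(x)), 3))
```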
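Similarly, for the Chapter 3 topics (the sigmoid activation and estimating error in logistic regression), a minimal sketch under the same caveats: sigmoid and log_loss are illustrative names, and the coefficients and data are made up.

```python
# Minimal sketch (not from the book): the sigmoid and the log-loss it is paired with.
import numpy as np

def sigmoid(z):
    """Map a linear score to a probability in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def log_loss(y_true, p):
    """Binary cross-entropy, the error typically minimized in logistic regression."""
    p = np.clip(p, 1e-12, 1 - 1e-12)  # avoid log(0)
    return float(-np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p)))

# Toy usage: a linear score b0 + b1*x pushed through the sigmoid
x = np.array([0.5, 1.5, 2.5, 3.5])
y = np.array([0, 0, 1, 1])
p = sigmoid(-3.0 + 1.5 * x)  # hypothetical coefficients
print(np.round(p, 3), round(log_loss(y, p), 3))
```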
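And for the Chapter 4 topics (calculating uncertainty via entropy and Gini impurity at a decision tree node), another small sketch; the class counts are hypothetical.

```python
# Minimal sketch (not from the book): entropy and Gini impurity for a node's class counts.
import numpy as np

def entropy(counts):
    """Shannon entropy (in bits) of the class distribution at a node."""
    p = np.asarray(counts, float)
    p = p[p > 0] / p.sum()
    return float(-np.sum(p * np.log2(p)))

def gini(counts):
    """Gini impurity of the class distribution at a node."""
    p = np.asarray(counts, float) / np.sum(counts)
    return float(1.0 - np.sum(p ** 2))

# Toy usage: a node with 8 positive and 2 negative examples
print(round(entropy([8, 2]), 3), round(gini([8, 2]), 3))
```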
- Isbn
- 9781484235638
- Label
- Pro machine learning algorithms : a hands-on approach to implementing algorithms in Python and R
- Title
- Pro machine learning algorithms
- Title remainder
- a hands-on approach to implementing algorithms in Python and R
- Statement of responsibility
- V. Kishore Ayyadevara
- Cataloging source
- N$T
- Creator name
- Ayyadevara, V. Kishore
- Dewey number
- 006.31
- Index
- no index present
- LC call number
- Q325.5
- Literary form
- non fiction
- Nature of contents
- dictionaries
- Subject
-
- Machine learning
- Python (Computer program language)
- R (Computer program language)
- Label
- Pro machine learning algorithms : a hands-on approach to implementing algorithms in Python and R, V. Kishore Ayyadevara
- Antecedent source
- unknown
- Carrier category
- online resource
- Carrier category code
-
- cr
- Carrier MARC source
- rdacarrier
- Color
- multicolored
- Content category
- text
- Content type code
-
- txt
- Content type MARC source
- rdacontent
- Dimensions
- unknown
- Extent
- 1 online resource
- File format
- unknown
- Form of item
- online
- Isbn
- 9781484235638
- Level of compression
- unknown
- Media category
- computer
- Media MARC source
- rdamedia
- Media type code
-
- c
- Quality assurance targets
- not applicable
- Reformatting quality
- unknown
- Sound
- unknown sound
- Specific material designation
- remote
- System control number
-
- on1042561229
- (OCoLC)1042561229