Access: lifetime

Content: 4.5 hours
**Description**

- Access 40 lectures & 4.5 hours of content 24/7
- Use gradient descent to solve for the optimal parameters of a Hidden Markov Model
- Learn how to work with sequences in Theano
- Calculate models of sickness & health
- Analyze how people interact with a website using Markov Models
- Explore Google's PageRank algorithm
- Generate images & discuss smartphone autosuggestions using HMMs
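
As a flavor of the material, the sick/healthy example can be sketched as a two-state Markov chain in NumPy. The transition probabilities below are invented for illustration and are not taken from the course.

```python
import numpy as np

# Hypothetical two-state Markov chain: state 0 = healthy, state 1 = sick.
# A[i, j] = P(next state = j | current state = i); each row sums to 1.
A = np.array([[0.9, 0.1],   # healthy -> healthy / sick
              [0.4, 0.6]])  # sick    -> healthy / sick

# Distribution over states after 3 days, starting healthy.
pi0 = np.array([1.0, 0.0])
pi3 = pi0 @ np.linalg.matrix_power(A, 3)

# Stationary distribution: the left eigenvector of A with eigenvalue 1,
# normalized so its entries sum to 1.
vals, vecs = np.linalg.eig(A.T)
stat = np.real(vecs[:, np.argmax(np.real(vals))])
stat = stat / stat.sum()
```

Here the long-run fraction of sick days falls out of the stationary distribution, which is exactly the kind of question the "Sick or Healthy" lectures work through.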

The Lazy Programmer is a data scientist, big data engineer, and full stack software engineer. For his master's thesis he worked on brain-computer interfaces using machine learning. These assist non-verbal and non-mobile persons to communicate with their family and caregivers.

He has worked in online advertising and digital media as both a data scientist and big data engineer, and built various high-throughput web services around that data. He has built big data pipelines using Hadoop, Pig, and MapReduce; created machine learning models to predict click-through rate and news feed recommender systems using linear regression, Bayesian Bandits, and collaborative filtering; and validated the results with A/B testing.

He has taught undergraduate and graduate students in data science, statistics, machine learning, algorithms, calculus, computer graphics, and physics at universities such as Columbia University, NYU, Humber College, and The New School.

Multiple businesses have benefited from his web programming expertise. He does all the backend (server), frontend (HTML/JS/CSS), and operations/deployment work. Some of the technologies he has used are: Python, Ruby/Rails, PHP, Bootstrap, jQuery (JavaScript), Backbone, and Angular. For storage/databases he has used MySQL, Postgres, Redis, MongoDB, and more.


**Details & Requirements**

- Length of time users can access this course: lifetime
- Access options: web streaming, mobile streaming
- Certificate of completion not included
- Redemption deadline: redeem your code within 30 days of purchase
- Experience level required: all levels, but knowledge of Python and NumPy is expected
- All code for this course is available for download *here*, in the directory hmm_class

**Compatibility**

- Internet required

**Terms**

- Instant digital redemption

**Introduction and Outline**

- Introduction and Outline: Why would you want to use an HMM? (4:04)
- Unsupervised or Supervised? (2:58)

**Markov Models**

- The Markov Property (4:39)
- Markov Models (4:50)
- The Math of Markov Chains (5:13)

**Markov Models: Example Problems and Applications**

- Example Problem: Sick or Healthy (3:26)
- Example Problem: Expected number of continuously sick days (2:53)
- Example application: SEO and Bounce Rate Optimization (8:53)
- Example Application: Build a 2nd-order language model and generate phrases (13:06)
- Example Application: Google’s PageRank algorithm (5:04)
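
The core idea of the PageRank lecture, treating the web as a Markov chain whose stationary distribution ranks pages, can be sketched with power iteration on a toy link graph. The graph and damping factor below are illustrative, not taken from the course.

```python
import numpy as np

# Toy link graph for illustration: 4 pages, links[i] = pages that page i links to.
links = {0: [1, 2], 1: [2], 2: [0], 3: [0, 2]}
n = 4

# Column-stochastic transition matrix: M[j, i] = 1/outdegree(i) if i links to j.
M = np.zeros((n, n))
for i, outs in links.items():
    for j in outs:
        M[j, i] = 1.0 / len(outs)

d = 0.85                   # damping factor (the usual choice)
r = np.full(n, 1.0 / n)    # start from the uniform distribution
for _ in range(100):       # power iteration toward the stationary distribution
    r = (1 - d) / n + d * (M @ r)

print(r)  # PageRank scores; pages with more inbound links rank higher
```

The `(1 - d) / n` term is the "random surfer" teleportation that keeps the chain irreducible, which is why a unique stationary distribution exists.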

**Hidden Markov Models for Discrete Observations**

- From Markov Models to Hidden Markov Models (6:02)
- HMMs are Doubly Embedded (1:59)
- How can we choose the number of hidden states? (4:22)
- The Forward-Backward Algorithm (4:27)
- Visual Intuition for the Forward Algorithm (3:32)
- The Viterbi Algorithm (2:57)
- Visual Intuition for the Viterbi Algorithm (3:16)
- The Baum-Welch Algorithm (2:38)
- Baum-Welch Explanation and Intuition (6:34)
- Baum-Welch Updates for Multiple Observations (4:53)
- Discrete HMM in Code (20:33)
- The underflow problem and how to solve it (5:05)
- Discrete HMM Updates in Code with Scaling (11:53)
- Scaled Viterbi Algorithm in Log Space (3:38)
- Gradient Descent Tutorial (4:30)
- Theano Scan Tutorial (12:40)
- Discrete HMM in Theano (11:42)
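
The scaling trick behind the underflow lectures can be sketched as follows: renormalize the forward variable at every step and accumulate the log of the scale factors, so the log-likelihood never underflows. The HMM parameters here are toy values for illustration, not the course's code.

```python
import numpy as np

# Toy discrete HMM: 2 hidden states, 3 observation symbols (illustrative).
pi = np.array([0.6, 0.4])            # initial state distribution
A  = np.array([[0.7, 0.3],           # A[i, j] = P(z_t = j | z_{t-1} = i)
               [0.2, 0.8]])
B  = np.array([[0.5, 0.4, 0.1],      # B[i, k] = P(x_t = k | z_t = i)
               [0.1, 0.3, 0.6]])

def log_likelihood(x):
    """Scaled forward algorithm: renormalize alpha at each step and
    sum the logs of the scale factors to get log P(x)."""
    alpha = pi * B[:, x[0]]
    c = alpha.sum()
    alpha /= c
    logL = np.log(c)
    for t in range(1, len(x)):
        alpha = (alpha @ A) * B[:, x[t]]  # forward recursion
        c = alpha.sum()
        alpha /= c                        # keep alpha a distribution
        logL += np.log(c)
    return logL

print(log_likelihood([0, 1, 2, 1, 0]))
```

Without the scaling, the raw forward variable shrinks geometrically with sequence length and underflows to zero for long sequences, which is exactly the problem the "underflow" lecture addresses.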

**HMMs for Continuous Observations**

- Gaussian Mixture Models with Hidden Markov Models (4:12)
- Generating Data from a Real-Valued HMM (6:35)
- Continuous-Observation HMM in Code (part 1) (18:38)
- Continuous-Observation HMM in Code (part 2) (5:12)
- Continuous HMM in Theano (16:32)
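
"Generating Data from a Real-Valued HMM" boils down to ancestral sampling: draw the hidden state path from the Markov chain, then draw each observation from that state's Gaussian. The parameters below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy continuous-observation HMM (illustrative parameters, not the course's):
# two hidden states, each emitting from its own 1-D Gaussian.
pi    = np.array([0.5, 0.5])      # initial state distribution
A     = np.array([[0.9, 0.1],     # "sticky" transition matrix
                  [0.1, 0.9]])
means = np.array([0.0, 5.0])      # per-state emission means
stds  = np.array([1.0, 1.0])      # per-state emission std devs

def sample(T):
    """Ancestral sampling: hidden path first, then one emission per step."""
    z = np.empty(T, dtype=int)
    z[0] = rng.choice(2, p=pi)
    for t in range(1, T):
        z[t] = rng.choice(2, p=A[z[t - 1]])
    x = rng.normal(means[z], stds[z])  # one Gaussian draw per time step
    return z, x

z, x = sample(200)
```

Because the chain is sticky, the sampled sequence shows long runs near 0 and long runs near 5, which is the regime-switching behavior the continuous-HMM lectures model.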

**HMMs for Classification**

- Generative vs. Discriminative Classifiers (2:30)
- HMM Classification on Poetry Data (Robert Frost vs. Edgar Allan Poe) (10:36)

**Bonus Example: Parts-of-Speech Tagging**

- Parts-of-Speech Tagging Concepts (5:00)
- POS Tagging with an HMM (5:58)

**Appendix**

- Review of Gaussian Mixture Models (3:04)
- Theano Tutorial (7:47)
- How to install Numpy, Scipy, Matplotlib, Pandas, IPython, Theano, and TensorFlow (17:22)
