95-865: Unstructured Data Analytics
(Fall 2025 Mini 2)


Lectures: see the course Canvas homepage for section meeting times and locations.
Note that the current plan is for Section C2 to be recorded.

Recitations (shared across sections): Fridays 2pm-3:20pm, HBH A301

Instructor: George Chen (email: georgechen ♣ cmu.edu) - replace "♣" with the "at" symbol

Teaching assistants:

Office hours (starting second week of class): Check the course Canvas homepage for the office hour times and locations.

Contact: Please use Piazza (follow the link to it within Canvas) and, whenever possible, post so that everyone can see (if you have a question, chances are other people can benefit from the answer as well!).

Course Description

Companies, governments, and other organizations now collect massive amounts of data such as text, images, audio, and video. How do we turn this heterogeneous mess of data into actionable insights? A common problem is that we often do not know what structure underlies the data ahead of time, which is why the data are often referred to as "unstructured". This course takes a practical, two-step approach to unstructured data analysis:

  1. We first examine how to identify possible structure present in the data via visualization and other exploratory methods.
  2. Once we have clues for what structure is present in the data, we turn toward exploiting this structure to make predictions.
Many examples are given for how these methods help solve real problems faced by organizations. Along the way, we encounter many of the most popular methods in analyzing unstructured data, from modern classics in manifold learning, clustering, and topic modeling to some of the latest developments in deep neural networks for analyzing text, images, and time series.

We will be coding lots of Python and dabbling a bit in GPU computing (Google Colab).

Note regarding GenAI (including large language models): As likely all of you are aware, there are now technologies like (Chat)GPT, Gemini, Claude, Llama, DeepSeek, etc., which will all be getting better over time. If you use any of these in your homework, please cite them. For the purposes of this class, I will view these as external collaborators (no different than if you got help from a human friend). For exams, I want to make sure that you actually understand the material and are not just telling me what someone else (human or not) knows. This is important so that in the future, if you get help from an AI assistant (or a human) with your unstructured data analysis, you have enough background knowledge to check for yourself whether the solution you are given is correct. For this reason, exams in this class will explicitly not allow electronics.

Prerequisite: If you are a Heinz student, then you must have taken 95-888 "Data-Focused Python" or 90-819 "Intermediate Programming with Python". If you are not a Heinz student and would like to take the course, please contact the instructor and clearly state what Python courses you have taken/what Python experience you have.

Helpful but not required: Math at the level of calculus and linear algebra may help you appreciate some of the material more.

Grading: Homework (30%), Quiz 1 (35%), Quiz 2 (35%*)

*Students with the most instructor-endorsed posts on Piazza will receive a slight bonus at the end of the mini, which will be added directly to their Quiz 2 score (a maximum of 10 bonus points, so that it is possible to get 110 out of 100 points on Quiz 2).

Letter grades are determined based on a curve.

Calendar for Sections A2/B2/C2 (tentative)

Previous version of course (including lecture slides and demos): 95-865 Spring 2025 mini 4

Part I. Exploratory data analysis
Week 1
Mon Oct 20/Tue Oct 21 Lecture 1: Course overview, analyzing text using frequencies
[slides]
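Not part of the official materials, but if you want a tiny preview of frequency-based text analysis, here is a minimal sketch (toy sentence made up for illustration):

    # Count word frequencies in a toy string using Python's built-in Counter.
    from collections import Counter

    text = "the quick brown fox jumps over the lazy dog the fox"
    counts = Counter(text.lower().split())   # crude tokenization: split on whitespace
    print(counts.most_common(3))             # e.g., [('the', 3), ('fox', 2), ('quick', 1)]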

Wed Oct 22/Thur Oct 23 Lecture 2: Basic text analysis demo (requires Anaconda Python 3 & spaCy)
[lecture slides]
[slides on how to install Anaconda Python 3 and spaCy (needed for HW1 and lecture demos)]
Note: Anaconda Python 3 includes support for Jupyter notebooks, which we use extensively in this class
[Jupyter notebook (basic text analysis)]
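To sanity-check your install, here is a minimal spaCy sketch (not the lecture demo; it assumes you have also downloaded the small English model via "python -m spacy download en_core_web_sm"):

    # Tokenize, lemmatize, and part-of-speech tag a sentence with spaCy.
    import spacy

    nlp = spacy.load("en_core_web_sm")   # assumes this model has been downloaded
    doc = nlp("Apple is looking at buying a U.K. startup.")
    print([(token.text, token.lemma_, token.pos_) for token in doc])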

HW1 released (check Canvas)
Fri Oct 24 Recitation slot: Lecture 3 — Basic text analysis (cont'd), co-occurrence analysis
[slides]
[Jupyter notebook (basic text analysis using arrays)]
[Jupyter notebook (co-occurrence analysis toy example)]
As we saw in class, PMI is defined in terms of log probabilities. Here's additional reading that provides some intuition on log probabilities (technical):
[Section 1.2 of lecture notes from CMU 10-704 "Information Processing and Learning" Lecture 1 (Fall 2016) discusses "information content" of random outcomes, which are in terms of log probabilities]
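To make this concrete, here is a toy PMI sketch (made-up counts, not from the course materials), using PMI(x, y) = log[P(x, y) / (P(x) P(y))], i.e., how much more often two words co-occur than they would if they were independent:

    # Compute PMI from a toy symmetric co-occurrence count matrix.
    import numpy as np

    words = ["cat", "dog", "piano"]                  # hypothetical vocabulary
    counts = np.array([[10., 4., 1.],
                       [ 4., 8., 1.],
                       [ 1., 1., 6.]])               # made-up co-occurrence counts

    joint = counts / counts.sum()                    # P(x, y)
    marginal = joint.sum(axis=1)                     # P(x) (matrix is symmetric)
    pmi = np.log(joint / np.outer(marginal, marginal))
    print(pmi[words.index("cat"), words.index("dog")])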
Week 2
Mon Oct 27/Tue Oct 28 Lecture 4: Co-occurrence analysis (cont'd), visualizing high-dimensional data with PCA
[slides]
[Jupyter notebook (text generation using n-grams)]
Additional reading (technical):
[Abdi and Williams's PCA review]

Supplemental videos:

[StatQuest: PCA main ideas in only 5 minutes!!!]
[StatQuest: Principal Component Analysis (PCA) Step-by-Step (note that this is a more technical introduction than mine using SVD/eigenvalues)]
[StatQuest: PCA - Practical Tips]
[StatQuest: PCA in Python (note that this video is more Pandas-focused whereas 95-865 is taught in a manner that is more numpy-focused to better prep for working with PyTorch later)]
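For a compact reference on the SVD view of PCA mentioned above, here is a minimal numpy sketch on toy data (an illustration, not the course demo):

    # PCA via SVD: center the data, then project onto the top right singular vectors.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 50))           # toy data: 100 points in 50 dimensions

    X_centered = X - X.mean(axis=0)          # PCA requires mean-centering
    U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
    X_2d = X_centered @ Vt[:2].T             # coordinates along top 2 principal directions
    explained = S**2 / np.sum(S**2)          # fraction of variance per component
    print(X_2d.shape, explained[:2])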
Wed Oct 29/Thur Oct 30 Lecture 5: PCA (cont'd), manifold learning (Isomap, MDS)
[slides]
[Jupyter notebook (PCA)]
Additional reading (technical):
[The original Isomap paper (Tenenbaum et al 2000)]

Python examples for manifold learning:

[scikit-learn example (Isomap, t-SNE, and many other methods)]
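Here is a minimal Isomap sketch on the classic Swiss roll (an illustration, not the scikit-learn example linked above):

    # Unroll a 3D Swiss roll into 2D using Isomap's geodesic-distance embedding.
    from sklearn.datasets import make_swiss_roll
    from sklearn.manifold import Isomap

    X, _ = make_swiss_roll(n_samples=1000, random_state=0)  # points on a rolled-up sheet
    X_2d = Isomap(n_neighbors=10, n_components=2).fit_transform(X)
    print(X_2d.shape)                                       # (1000, 2)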
Fri Oct 31 Recitation slot: More on dimensionality reduction
[slides (how to save a Jupyter notebook as PDF)]
[Jupyter notebook (more on PCA, argsort)]
[Jupyter notebook (manifold learning)]
[Jupyter notebook (analyzing the 20 Newsgroups dataset)]
Week 3
Mon Nov 3/Tue Nov 4 HW1 due Monday Nov 3, 11:59pm

No lecture on Mon Nov 3/Tue Nov 4 (this is to keep the three sections of the class synced and to account for one of them not being held due to CMU's observance of Democracy Day)
Wed Nov 5/Thur Nov 6 Lecture 6: Wrap up manifold learning, intro to clustering
[slides]
[continuation of demo from last Friday's recitation: Jupyter notebook (manifold learning)]
[Jupyter notebook (PCA and t-SNE with images)***]
***For the demo on PCA and t-SNE with images to work, you will need to install some packages:
pip install torch torchvision
[required reading (not covered in lecture but you're expected to understand the coverage here): "How to Use t-SNE Effectively" (Wattenberg et al 2016)]

HW2 released (check Canvas)
Supplemental video:
[StatQuest: t-SNE, clearly explained]

Additional reading (technical):

[some technical slides on t-SNE by George for 95-865]
[Simon Carbonnelle's much more technical t-SNE slides]
[t-SNE webpage]
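Here is a minimal t-SNE sketch on a small digits dataset (illustration only; the class demo uses different data, and, as the Wattenberg et al reading stresses, settings such as perplexity can drastically change the resulting picture):

    # Embed 64-dimensional digit images into 2D with t-SNE (for visualization only).
    from sklearn.datasets import load_digits
    from sklearn.manifold import TSNE

    X, y = load_digits(return_X_y=True)      # 8x8 images flattened to 64 features
    X_2d = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(X)
    print(X_2d.shape)                        # (1797, 2)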
Fri Nov 7 Recitation slot: Lecture 7 — Clustering
[slides]
[Jupyter notebook (preprocessing 20 Newsgroups dataset)]
[Jupyter notebook (clustering 20 Newsgroups dataset)]
Additional reading on clustering (technical):
[see Section 14.3 of the book "Elements of Statistical Learning"]

Supplemental video:

[StatQuest: K-means clustering (note: the elbow method is specific to using total variation (i.e., residual sum of squares) as a score function; the elbow method is not always the approach you should use with other score functions)]
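To see the elbow method caveat above in code, here is a minimal k-means sketch on toy data (not the course demo): compute the residual sum of squares (scikit-learn's inertia_) for several values of k and look for a bend:

    # Run k-means for k = 1..8 and print the within-cluster sum of squares for each.
    from sklearn.cluster import KMeans
    from sklearn.datasets import make_blobs

    X, _ = make_blobs(n_samples=500, centers=4, random_state=0)  # toy data, 4 true blobs
    for k in range(1, 9):
        km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
        print(k, km.inertia_)   # expect the "elbow" to appear around k = 4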
Week 4
Mon Nov 10/Tue Nov 11 Lecture 8: Clustering (cont'd)
[slides]
[we resume the demo from last time: Jupyter notebook (clustering 20 Newsgroups dataset)]

Tue Nov 11, 8pm-9:30pm: optional Quiz 1 review session (check Canvas for Zoom link)
Same supplemental materials as the previous lecture
Wed Nov 12/Thur Nov 13 Lecture 9: Wrap up clustering, topic modeling
[slides]
[we resume the demo from last time: Jupyter notebook (clustering 20 Newsgroups dataset)]
[required reading (not covered in lecture but you're expected to understand the coverage here): Jupyter notebook (toy GMM example to show when CH index actually works)]
[Jupyter notebook (clustering on text revisited using TF-IDF, normalizing using Euclidean norm)]
[required reading (not covered in lecture but you're expected to understand the coverage here): Jupyter notebook (clustering with images)]
[Jupyter notebook (topic modeling with LDA)]
Topic modeling reading:
[David Blei's general intro to topic modeling]
[Maria Antoniak's practical guide for using LDA]
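Here is a minimal LDA sketch (illustration only; it uses scikit-learn and may not match the setup in the course notebook):

    # Fit LDA on raw word counts and print the top words for each topic.
    from sklearn.datasets import fetch_20newsgroups
    from sklearn.decomposition import LatentDirichletAllocation
    from sklearn.feature_extraction.text import CountVectorizer

    docs = fetch_20newsgroups(remove=("headers", "footers", "quotes")).data[:500]
    vec = CountVectorizer(max_features=2000, stop_words="english")
    counts = vec.fit_transform(docs)                 # LDA expects raw term counts
    lda = LatentDirichletAllocation(n_components=5, random_state=0).fit(counts)
    vocab = vec.get_feature_names_out()
    for topic in lda.components_:                    # one row of word weights per topic
        print([vocab[i] for i in topic.argsort()[-8:][::-1]])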
Fri Nov 14 Recitation slot: Quiz 1 (80-minute exam) — material coverage is up to and including last Friday's (Nov 7) recitation
Part II. Predictive data analysis
Week 5
Mon Nov 17/Tue Nov 18 Lecture 10: Wrap up topic modeling; intro to predictive data analysis
[slides]
[Jupyter notebook (LDA: choosing the number of topics)]
Wed Nov 19/Thur Nov 20 Lecture 11: Wrap up intro to predictive data analysis; intro to neural nets & deep learning
[slides]
[Jupyter notebook (prediction and model validation)]
Additional reading on basic neural networks:
[Chapter 1 "Using neural nets to recognize handwritten digits" of the book Neural Networks and Deep Learning]

Video introduction on neural nets:

["But what *is* a neural network? | Chapter 1, deep learning" by 3Blue1Brown]

StatQuest series of videos on neural nets and deep learning:

[YouTube playlist (note: there are a lot of videos in this playlist, some of which go into more detail than you're expected to know for 95-865; make sure that you understand concepts at the level of how they are presented in 95-865 lectures/recitations)]
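Here is a minimal neural net sketch (illustration only; the course demos are the real reference): a one-hidden-layer network in PyTorch taking a single gradient step on a random stand-in batch:

    # One hidden layer, cross-entropy loss, and one gradient-descent update.
    import torch
    import torch.nn as nn

    model = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 10))
    loss_fn = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    x = torch.randn(16, 64)            # random stand-in batch of 16 examples
    y = torch.randint(0, 10, (16,))    # random stand-in labels
    loss = loss_fn(model(x), y)
    optimizer.zero_grad()
    loss.backward()                    # backpropagation computes gradients
    optimizer.step()                   # one parameter update
    print(loss.item())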
Fri Nov 21 Recitation slot: Some key concepts for prediction
[slides]
[Jupyter notebook]
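A related minimal sketch (illustration only, not the recitation notebook): hold out a validation set to estimate accuracy honestly, rather than scoring on the training data:

    # Train/validation split: training accuracy is optimistic; validation accuracy
    # is a more honest estimate of how the model generalizes.
    from sklearn.datasets import load_digits
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    X, y = load_digits(return_X_y=True)
    X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.25, random_state=0)
    clf = LogisticRegression(max_iter=5000).fit(X_train, y_train)
    print("train accuracy:", clf.score(X_train, y_train))
    print("val accuracy:  ", clf.score(X_val, y_val))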
Week 6
Mon Nov 24/Tue Nov 25 HW2 due Monday Nov 24, 11:59pm

Lecture 12: Wrap up neural net basics; image analysis with convolutional neural nets (also called CNNs or convnets)
[slides]
For the neural net demo below to work, you will need to install some packages:
pip install torch torchvision torchinfo
[Jupyter notebook (handwritten digit recognition with neural nets; be sure to use the "Download ZIP" link to download all the files, especially so that you also get UDA_pytorch_utils.py)]
PyTorch tutorial (at the very least, go over the first page of this tutorial to familiarize yourself with going between NumPy arrays and PyTorch tensors, and also understand the basic explanation of how tensors can reside on either the CPU or a GPU):
[PyTorch tutorial]
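Here is a minimal sketch of the NumPy/PyTorch interop described above (illustration only):

    # Convert between NumPy arrays and PyTorch tensors, and move a tensor to a GPU
    # if one is available.
    import numpy as np
    import torch

    a = np.arange(6.0).reshape(2, 3)
    t = torch.from_numpy(a)       # tensor that shares memory with the numpy array
    back = t.numpy()              # back to numpy (only works for CPU tensors)

    device = "cuda" if torch.cuda.is_available() else "cpu"
    t_dev = t.to(device)          # on a GPU this makes a copy; use .cpu() before .numpy()
    print(t_dev.device)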

Supplemental reading and video for convolutional neural networks (CNNs):

[Stanford CS231n Convolutional Neural Networks for Visual Recognition]
[(technical) Richard Zhang's fix for max pooling]
In the StatQuest YouTube playlist (from the previous lecture's supplemental materials), there's a video in the playlist on CNNs
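Here is a minimal convnet sketch (illustration only; the handwritten digit demo above is the real reference): a tiny CNN for 28x28 grayscale images, run on one random stand-in batch:

    # Conv layer -> ReLU -> max pooling -> flatten -> linear classifier.
    import torch
    import torch.nn as nn

    model = nn.Sequential(
        nn.Conv2d(1, 8, kernel_size=3, padding=1),  # 1 input channel -> 8 feature maps
        nn.ReLU(),
        nn.MaxPool2d(2),                            # 28x28 -> 14x14
        nn.Flatten(),
        nn.Linear(8 * 14 * 14, 10),                 # 10 output classes
    )
    x = torch.randn(32, 1, 28, 28)                  # random stand-in batch of "images"
    print(model(x).shape)                           # torch.Size([32, 10])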
Wed Nov 26—Fri Nov 28 No class (Thanksgiving holiday)
🦃
Week 7
Mon Dec 1/Tue Dec 2 Lecture 13: Wrap up CNNs; start time series analysis with recurrent neural nets (RNNs)
[slides]
[we resume the demo from last time: Jupyter notebook (handwritten digit recognition with neural nets; be sure to scroll to the bottom to download UDA_pytorch_utils.py)]
See the supplemental materials from the previous lecture; note that the StatQuest neural nets and deep learning YouTube playlist (in the previous lecture's supplemental materials) includes a video on RNNs.
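Here is a minimal RNN sketch (illustration only): an LSTM reads a batch of sequences, and its final hidden state feeds a linear classifier:

    # LSTM over sequences; classify each sequence from the last hidden state.
    import torch
    import torch.nn as nn

    lstm = nn.LSTM(input_size=16, hidden_size=32, batch_first=True)
    classifier = nn.Linear(32, 2)      # e.g., positive vs negative sentiment

    x = torch.randn(8, 20, 16)         # 8 sequences, 20 time steps, 16 features each
    outputs, (h_n, c_n) = lstm(x)      # h_n: final hidden state, shape (1, 8, 32)
    logits = classifier(h_n[-1])       # shape (8, 2)
    print(logits.shape)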
Wed Dec 3/Thur Dec 4 Lecture 14: Wrap up RNNs; glimpse of word embeddings; start coverage on text generation
[slides]
For the neural net demos below to work, you will need to install the Hugging Face transformers package (in addition to the packages needed to run previous neural net demos):
pip install transformers
[required reading: Jupyter notebook (quick intro on how to use BERT/BERT-Tiny)]
[required reading: Jupyter notebook (sentiment analysis with IMDb reviews; requires UDA_pytorch_utils.py from the previous lecture's demo)]
[required reading: Jupyter notebook (variant of the sentiment analysis RNN demo that learns a static word embedding (no BERT-Tiny) and uses a vanilla ReLU RNN (not an LSTM); even though the resulting model does not work as well as the one in previous demo, this notebook can be helpful to better understand what's going on with a much simpler model)]
BERT word embeddings (technical):
[A tutorial on BERT word embeddings]

Extra notebooks:

[Jupyter notebook (slight variant on the sentiment analysis RNN demo where the BERT-Tiny model is treated as frozen, so that model training only learns parameters for the LSTM and Linear layers)]
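Here is a minimal Hugging Face sketch for getting contextual word embeddings (illustration only; "prajjwal1/bert-tiny" is a commonly used community copy of BERT-Tiny and is my assumption here, not necessarily the model in the course demos):

    # Tokenize a sentence and get one contextual embedding vector per token.
    import torch
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("prajjwal1/bert-tiny")  # assumed model name
    model = AutoModel.from_pretrained("prajjwal1/bert-tiny")

    inputs = tokenizer("The movie was surprisingly good.", return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    print(outputs.last_hidden_state.shape)   # (1, num_tokens, 128): one vector per token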
Fri Dec 5 Recitation slot: Lecture 15 — Text generation with generative pretrained transformers; course wrap-up
[slides]
[Jupyter notebook (text generation with neural nets)]
Additional reading/videos:
[Andrej Karpathy's "Neural Networks: Zero to Hero" lecture series (including a more detailed GPT lecture)]

Software for explaining neural nets:

[Captum]

Some articles on being careful with explanation methods (technical):

["The Disagreement Problem in Explainable Machine Learning: A Practitioner's Perspective" (Krishna et al 2022)]
["Do Feature Attribution Methods Correctly Attribute Features?" (Zhou et al 2022)]
["The false hope of current approaches to explainable artificial intelligence in health care" (Ghassemi et al 2021)]
Final exam week
Mon Dec 8 HW3 due 11:59pm
Fri Dec 12 Quiz 2 (80-minute exam): 1pm-2:20pm, HBH 1204 & HBH 1206 (yes, there are two rooms that are next to each other; please go to either one and try to space yourselves out a bit)

Quiz 2 focuses on material from weeks 4–7 (note that, given how the course is set up, material from weeks 4–7 at times naturally relates to material from weeks 1–3, so some ideas from those earlier weeks could still show up on Quiz 2; please focus your studying on material from weeks 4–7)