How to Build Your Own Custom Price Tracker for Best Buy Products

Do you ever find yourself constantly checking prices for a particular product you want to buy, hoping to catch a price drop? As an avid deal seeker myself, I know how tedious and time-consuming this can be.

Luckily, with some Python code and web scraping skills, we can automate the process of tracking Best Buy prices!

In this comprehensive guide, I'll walk you through, step by step, how to build your own custom price tracker for any Best Buy product. We'll scrape product data, analyze price history, get email alerts for price drops, and schedule daily updates.

Why Track Prices Programmatically?

Before we dive into the code, let's first talk about why you may want to track prices programmatically instead of manually.

Studies have consistently shown that the large majority of consumers comparison shop across multiple websites before making a purchase. People want to feel assured they are getting the best possible deal.

Manually checking prices across different sites is extremely laborious:

  • Visiting each product page takes time, especially on slower connections.

  • It's difficult to compare prices from memory or scattered browser tabs.

  • Tracking price history over weeks or months is nearly impossible.

  • There's no way to get notified immediately if a price drops unless you happen to be checking at that moment.

A programmatic solution solves these issues by scraping product data at scale and automating analysis and notifications.

Some key benefits of building your own price tracker include:

  • Faster price checks – Scrape many product pages in parallel.

  • Unified data – All price info in one place for easy analysis.

  • Price history – Visually track prices over time.

  • Notifications – Receive real-time alerts on price drops.

  • Automation – Run checks on a schedule without manual effort.

  • Customizations – Tailor tracking to your specific needs.

For power shoppers, having access to this data can mean snagging amazing deals others will miss out on!

Next, let's go over some options for tracking prices programmatically before diving into the code.

Comparing Price Tracking Approaches

There are a few different approaches we can use for automated price tracking:

Browser Extensions

Extensions like Honey and CamelCamelCamel provide basic price tracking and history directly on product pages. However, each extension only covers the retailers it supports; CamelCamelCamel, for example, is limited to Amazon.

Product APIs

Some stores provide APIs for fetching product pricing data, but availability is inconsistent, and access usually requires registering for an API key.

Web Scraping

Scraping product details directly from the HTML lets us track prices on any site, and the page data is always up to date.

For maximum flexibility and coverage, web scraping typically works best. In the rest of this guide, we'll use Python to build a scrape-based tracker.

Step 1 – Install Python Libraries

Let's outline the core libraries we'll use for fetching, analyzing, and alerting on price data:

Requests – Makes HTTP requests to fetch product pages. More robust than using the basic urllib module.

BeautifulSoup – Parses HTML and XML documents returned by Requests. We can extract data using CSS selectors.

Pandas – Provides DataFrames for convenient data analysis and CSV reading/writing.

Matplotlib – Powerful Python plotting library we'll use for price history charts.

smtplib – Sends email notifications so we get alerts on price drops. This one is part of Python's standard library, so it needs no installation.

schedule – Enables running our tracker on a fixed schedule, like daily checks.

Install the third-party libraries using pip:

pip install requests beautifulsoup4 pandas matplotlib schedule

We'll import these at the top of our Python script later on.
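For reference, here is the full set of imports the finished script will accumulate (csv, os, smtplib, time, and datetime all ship with Python's standard library):

import csv
import os
import smtplib
import time
from datetime import date

import requests
import pandas as pd
import schedule
from bs4 import BeautifulSoup
from matplotlib import pyplot as plt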

Step 2 – Fetch Best Buy Product Data

The first step our price tracker bot needs to do is fetch the product data from BestBuy.com pages.

For each product, we want to scrape details like:

  • Product title
  • Current price
  • Price history (how much it used to cost)
  • Availability status

Best Buy exposes much of this data in the product page HTML. Keep in mind that the site's markup changes over time, so treat the IDs and class names below as a starting point and verify them in your browser's developer tools.

Let's create a function get_product_data() that takes in a Best Buy product URL, makes the HTTP request, and extracts the details we need:

from bs4 import BeautifulSoup
import requests

def get_product_data(url):

  headers = {
    "User-Agent": "My Price Tracker Bot v1.0"
  }

  page = requests.get(url, headers=headers)

  soup = BeautifulSoup(page.content, 'html.parser')

  title = soup.find(id="sku-title").get_text(strip=True)

  price = soup.find(class_="sku-list-price").get_text(strip=True)

  # class_ (with an underscore) because "class" is a reserved word in Python
  availability = soup.find(class_="add-to-cart-button").get_text(strip=True)

  # Extract full pricing history from table
  # (price_point avoids shadowing the price variable above)
  history = []
  for price_row in soup.find(id="pricing-history-table").find_all('tr'):
    date = price_row.find('td', class_="pricing-history-table__date").get_text(strip=True)
    price_point = price_row.find('td', class_="pricing-history-table__price").get_text(strip=True)
    history.append((date, price_point))

  data = {
    'title': title,
    'price': price,
    'availability': availability,
    'price_history': history
  }

  return data

Here we:

  • Set a custom User-Agent header to identify our scraper bot
  • Fetch the page HTML with Requests
  • Parse the HTML with BeautifulSoup
  • Find and extract key elements like title, price, availability using IDs and CSS classes
  • Loop through the pricing history table rows to get date and price data

Finally, we return all the data in a tidy Python dictionary.

Let's test it by fetching data for a Galaxy S21 phone:

product_url = "https://www.bestbuy.com/site/samsung-galaxy-s21-5g-128gb-phantom-gray-verizon/6426298.p?skuId=6426298"

product_data = get_product_data(product_url)

print(product_data)

This would return a dictionary containing the S21's title, price, availability, and full pricing history.
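Assuming the selectors matched, the printed dictionary would look something like this (the values here are illustrative, not actual Best Buy prices):

{
  'title': 'Samsung - Galaxy S21 5G 128GB - Phantom Gray (Verizon)',
  'price': '$799.99',
  'availability': 'Add to Cart',
  'price_history': [('01/05/2023', '$849.99'), ('02/10/2023', '$799.99')]
}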

With get_product_data(), our script can now easily fetch key details for any Best Buy product. Next, let's look at saving the history.

Step 3 – Save Price History to CSV

Now that we can extract price history data, let's save it to a CSV file. This will allow us to accumulate price data over time for further analysis.

We'll need to:

  • Check if the CSV file already exists and, if not, create it
  • Append each new price data point as a row
  • Save the updated CSV

This ensures we have the full pricing history in one place.

import csv
import os
from datetime import date

def save_price_history(data):

  today = date.today().strftime("%m/%d/%Y")

  filename = "price_history.csv"
  fields = ['title', 'price', 'date']

  # Check for the file before opening in append mode,
  # since 'a' creates it automatically if it doesn't exist
  file_exists = os.path.exists(filename)
  with open(filename, 'a', newline='') as f:

    writer = csv.writer(f)

    # Write header row if new file
    if not file_exists:
      writer.writerow(fields)

    # Append new price data
    row = [data['title'], data['price'], today]
    writer.writerow(row)

We can now call this after fetching data:

product_data = get_product_data(product_url)
save_price_history(product_data)

This will gradually build up a local CSV with historical pricing data for analysis.
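You can sanity-check the accumulating file at any point with Pandas; the rows shown in the comment below are just illustrative:

import pandas as pd

# Peek at the most recent recorded prices
print(pd.read_csv("price_history.csv").tail())

#                                  title    price        date
# 0  Samsung - Galaxy S21 5G 128GB - ...  $849.99  01/05/2023
# 1  Samsung - Galaxy S21 5G 128GB - ...  $799.99  02/10/2023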

Next, let's look at ways to analyze the data and get notified of changes.

Step 4 – Plot Price History Charts with Matplotlib

Visualizing the price history over time can be extremely helpful for spotting trends.

The Python library Matplotlib makes plotting the CSV data easy.

First we'll load the CSV with Pandas, convert the scraped strings into real dates and numbers, then plot price on the Y-axis and date on the X-axis:

import pandas as pd
from matplotlib import pyplot as plt

# Load in our CSV
df = pd.read_csv("price_history.csv")

# The scraped values are strings, so parse the dates and
# strip currency formatting before plotting
df['date'] = pd.to_datetime(df['date'], format="%m/%d/%Y")
df['price'] = (df['price'].str.replace('$', '', regex=False)
                          .str.replace(',', '', regex=False)
                          .astype(float))

# Plot prices over time
plt.plot(df['date'], df['price'])

# Add labels and title
plt.xlabel('Date')
plt.ylabel('Price')
plt.title(df['title'][0]) # Title from first row

# Display chart
plt.show()

Now we have a nice price history chart for analysis!

[Price history chart: the tracked product's price plotted over time]

From this we can easily spot price drops, which are key buying opportunities.
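We can also quantify drops directly from the DataFrame. Here's a small sketch, reusing the cleaned df from the plotting code above, that reports the all-time low and the largest drop between consecutive checks:

# Lowest price ever recorded
print("All-time low:", df['price'].min())

# Largest drop between consecutive recorded prices
drops = df['price'].diff()
if drops.min() < 0:
  when = df['date'][drops.idxmin()]
  print(f"Biggest single drop: {abs(drops.min()):.2f} on {when}")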

Some other useful Matplotlib chart types for price data include:

  • Bar charts for comparing prices across products
  • Scatter plots to analyze price vs. time
  • Multiple plots for overlaying pricing trends

Visualizations reveal insights that would be nearly impossible to see in raw CSV data.
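For example, overlaying pricing trends for several products is just a loop over a groupby, assuming price_history.csv now holds rows for more than one title:

# One line per product title, all on the same axes
for title, group in df.groupby('title'):
  plt.plot(group['date'], group['price'], label=title)

plt.xlabel('Date')
plt.ylabel('Price')
plt.legend()
plt.show()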

Next, let's look at getting notifications when key events occur, like price drops.

Step 5 – Get Alerts on Price Changes via Email

Staring at price history charts waiting for drops isn't an efficient use of time.

For instant notifications when prices change, we can use Python to send automated emails.

The smtplib module makes sending emails simple with just a few lines of code:

import smtplib

# Check for a price drop.
# get_current_price() and get_last_price() are stand-in helpers: e.g.
# get_current_price() could call get_product_data() and parse the result,
# while get_last_price() could read the last row of price_history.csv.
current_price = get_current_price()
last_price = get_last_price()

if current_price < last_price:

  # Send email alert via Gmail's SMTP server
  server = smtplib.SMTP('smtp.gmail.com', 587)
  server.ehlo()
  server.starttls()
  server.ehlo()

  # Gmail requires an app password for SMTP logins,
  # not your regular account password
  server.login('[email protected]', 'app-password')

  subject = "Price dropped!"
  body = f"The price for {product_url} has dropped from {last_price} to {current_price}"

  msg = f"Subject: {subject}\n\n{body}"

  server.sendmail(
    '[email protected]',
    '[email protected]',
    msg
  )

  print('Email sent!')

  server.quit()

We first check if the current price is lower than the last price we have recorded. If so, we connect to Gmail via SMTP, log in, and send a custom email alert.

Now anytime a price drops, we'll receive an email immediately letting us know!
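One practical refinement: tiny fluctuations can make this noisy, so you may want to alert only on meaningful drops. Here's a sketch of that guard (the 5% threshold is an arbitrary choice, and send_price_alert() stands in for the email logic above):

# Only alert when the price falls by at least 5%
THRESHOLD = 0.05

if current_price < last_price * (1 - THRESHOLD):
  send_price_alert(current_price, last_price)  # wraps the email code above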

Some other ways to get notifications beyond email:

  • Send SMS alerts with Twilio
  • Create mobile push notifications with Pushbullet
  • Post alerts to Slack/Discord channels

Emails provide a simple starting point for monitoring critical events.
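If you'd rather get pinged in chat, here's a minimal sketch of the webhook route. The URL is a placeholder you'd generate in your Slack workspace settings (Discord webhooks work the same way but expect a "content" key instead of "text"):

import requests

# Placeholder incoming-webhook URL for a Slack channel
WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"

def post_alert(message):
  # Slack incoming webhooks accept a simple JSON payload
  requests.post(WEBHOOK_URL, json={"text": message})

post_alert(f"Price dropped from {last_price} to {current_price}!")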

Step 6 – Schedule Regular Price Checks

The final piece of our Best Buy price tracker is automating regular checks.

We want our bot to run automatically on a schedule, like every day, rather than needing to manually trigger it.

The Python schedule library makes this a breeze:

import schedule
import time

def price_check():
  ...  # Run the main price check logic here (fetch, save, alert)

# Schedule price check  
schedule.every().day.at("9:00").do(price_check)

while True:
  schedule.run_pending()
  time.sleep(1)

This will now run our price_check() function every day at 9 AM to fetch the latest data.

Some other scheduling options, sketched in code after this list, include:

  • Run every X minutes/hours with schedule.every(5).minutes.do()
  • Run on certain days with schedule.every().monday.do()
  • Chain multiple schedules like daily + hourly
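For reference, here's how those variants look with the schedule API:

# Run every 5 minutes
schedule.every(5).minutes.do(price_check)

# Run only on Mondays
schedule.every().monday.do(price_check)

# Chain schedules: daily at 9 AM plus every hour
schedule.every().day.at("9:00").do(price_check)
schedule.every().hour.do(price_check)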

Scheduled jobs ensure we automatically have the most up-to-date price data at all times!
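To tie the steps together, here's a sketch of what a complete price_check() might look like. It reuses get_product_data() and save_price_history() from earlier; parse_price() is a small helper introduced here to turn scraped strings like "$799.99" into numbers, and send_price_alert() stands in for the Step 5 email logic:

import os
import pandas as pd

def parse_price(text):
  # Convert a scraped string like "$799.99" into a float
  return float(text.replace('$', '').replace(',', ''))

def price_check():
  data = get_product_data(product_url)
  current_price = parse_price(data['price'])

  # Compare against the most recent recorded price, if any
  if os.path.exists("price_history.csv"):
    df = pd.read_csv("price_history.csv")
    if len(df) > 0:
      last_price = parse_price(df['price'].iloc[-1])
      if current_price < last_price:
        send_price_alert(current_price, last_price)

  # Record today's price for next time
  save_price_history(data)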

Summary

And there you have it – a fully functional custom price tracker for Best Buy products using Python!

Here are some key points:

  • Web scraping with Requests and BeautifulSoup provides flexible price data extraction from any site.

  • Storing history in CSVs enables further analysis and charting with Pandas/Matplotlib.

  • Plotting visualizations reveals insights like price drops.

  • Email and SMS alerts enable real-time notifications when prices change.

  • Scheduled jobs provide automated daily price updates.

While this tutorial focused on Best Buy, the same principles can be applied to build scrapers for other ecommerce sites like Amazon, Walmart, and Target.

The code is also highly extensible. Some next steps could include:

  • Scraping more data, like ratings, reviews, and images.

  • Switching to a real database like MySQL or MongoDB for more storage flexibility.

  • Building a web interface or mobile app for the tracker.

  • Deploying the scraper to a server or container for 24/7 operations.

  • Using proxies and Celery workers for distributed scraping at scale.

I hope you found this guide helpful and actionable! Let me know if you have any other questions.

Happy deal hunting!
