Understanding Competitor Pricing Strategies with Web Scraping


Dynamic pricing has been heralded as the primary way to employ web scraping in ecommerce and retail. Pricing strategies in these industries, however, are multifactorial. While using scraping to match competitor prices can improve revenue, the same data collection method can also gather more strategic information.

In general, prices collected at a large scale can reflect the wider strategy behind them. This is especially relevant in retail, where promotional sales and discounts can be great drivers of revenue.

Admittedly, some of these promotions may be implemented ad hoc. Yet research shows that employing carefully curated pricing strategies in retail can deliver better results, even when they consist of simple discounts.


Breadth and depth of sales

In "Web scraping retail food prices", the authors propose that there's a relationship between the depth (defined as the size of a particular discount) and breadth (the number of products on sale) of promotions. Managing these two together can bring in greater revenue than optimizing each independently.

After working through the underlying mathematics, the authors apply web scraping to a retailer that lists between 200 and 600 products on sale each day. The collected data shows that the retailer in question implements a carefully selected combination of breadth and depth of sales.

The authors propose that focusing on a single aspect of such a promotional strategy might have unintended consequences. Since they only address certain aspects in isolation, I'll add my own thoughts to explain the possible combinations:

  • Low breadth, low depth. Unlikely to have a significant impact on shopping behavior.
  • High breadth, low depth. Might cause shoppers to experiment with different brands, otherwise no significant effect.
  • Low breadth, high depth. Might cause shoppers to start cherry-picking products across retailers to maximize discount value.
  • High breadth, high depth. Could be beneficial to the business if carefully managed. Could be detrimental if implemented recklessly.
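The four combinations above can be sketched as a simple classifier. This is a hypothetical illustration, not the authors' model: the cutoff values and the `classify_promotions` helper are my own assumptions, chosen only to make the quadrants concrete.

```python
# Hypothetical sketch: classify one day's promotions into the four
# breadth/depth quadrants discussed above. Thresholds are illustrative
# and would need tuning per retailer.
from statistics import mean

def classify_promotions(discounts, breadth_cutoff=100, depth_cutoff=0.20):
    """discounts: discount fractions (e.g. 0.15 for 15% off) for every
    product on sale that day."""
    breadth = "high" if len(discounts) >= breadth_cutoff else "low"
    depth = "high" if discounts and mean(discounts) >= depth_cutoff else "low"
    return f"{breadth} breadth, {depth} depth"

print(classify_promotions([0.10] * 50))    # low breadth, low depth
print(classify_promotions([0.30] * 250))   # high breadth, high depth
```

In practice, the cutoffs themselves would be derived from the scraped data, e.g. from the retailer's historical median breadth and depth.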

In any case, a supplementary goal of promotions and sales is to attract new customers. Without any discounts, it is likely that only store-loyal customers would purchase products, with a small number of new ones testing out the brand.

As a result, a pricing strategy requires careful consideration and balance, as it can lead to both beneficial and detrimental results. Getting data on how companies approach such a process used to be difficult; with web scraping, it's now possible.


Web scraping competitors

Pricing data collection is a relatively streamlined process. As it’s such a popular use case for web scraping, most providers will have dedicated tools or APIs available that will make extraction easy.

Even when building a pricing data collection system in-house, the process can still be comparatively simple. While web scraping solutions generally have to deal with enormous amounts of unstructured data, price collection only necessitates a few data points, which can be easily parsed into a structured format.
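To show how few data points are actually needed, here is a minimal parsing sketch using only the standard library. The markup, CSS class names, and `PricePoint` structure are all hypothetical; real product pages will differ, and a proper HTML parser (e.g. BeautifulSoup) is preferable in practice.

```python
# Minimal price-extraction sketch. Assumes each product page exposes a
# name, a current price, and an optional pre-discount price.
import re
from dataclasses import dataclass
from typing import Optional

@dataclass
class PricePoint:
    name: str
    price: float
    list_price: Optional[float]  # pre-discount price, if one is shown

def parse_product(html: str) -> PricePoint:
    # Grab the text content of the first element with the given class.
    def field(css_class):
        m = re.search(rf'class="{css_class}"[^>]*>([^<]+)<', html)
        return m.group(1).strip() if m else None

    name = field("product-name")
    price = float(field("price").lstrip("$"))
    old = field("old-price")
    return PricePoint(name, price, float(old.lstrip("$")) if old else None)

page = ('<span class="product-name">Oat milk</span>'
        '<span class="price">$3.49</span>'
        '<span class="old-price">$4.29</span>')
print(parse_product(page))
# PricePoint(name='Oat milk', price=3.49, list_price=4.29)
```

Storing records like these per product per day is enough to reconstruct both the breadth and the depth of a competitor's promotions.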

While the easiest route is to use such data for dynamic pricing, it can be developed much further. As the research above outlines, pricing data can be mapped out to specific product categories with totals being calculated on a daily basis.

Knowing that the depth and breadth of price changes influence shopping behavior and, in turn, revenue, understanding how competitors are implementing their strategies can provide insight into how to optimize these factors. Even if no competitor is implementing a well-thought-out promotional and discounting strategy, some trends can be revealed through the data.

To get the best insights, the total number of discounted products should be calculated daily, just as the authors did in the study. Additionally, for large ecommerce companies that sell several categories of products, separating these into their own sections can be beneficial as well.

With some historical data, trends will be revealed. In an ideal case, where competitors have a predefined promotional strategy, a specific range for sale breadth will be discovered. Uncovering sale depth might be slightly more complicated as the limits for discount sizes are more stringent. It is still, however, possible with enough data.
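The aggregation described above can be sketched in a few lines. The record layout (date, category, discount fraction) is my own assumption about how scraped discount data might be stored.

```python
# Sketch: aggregate scraped discount records into daily per-category
# counts, then derive the observed breadth range for a category.
from collections import defaultdict

def daily_breadth(records):
    """records: iterable of (date, category, discount) tuples."""
    counts = defaultdict(int)
    for date, category, _discount in records:
        counts[(date, category)] += 1
    return dict(counts)

def breadth_range(counts, category):
    totals = [n for (_d, c), n in counts.items() if c == category]
    return (min(totals), max(totals)) if totals else None

records = [
    ("2024-03-01", "dairy", 0.15),
    ("2024-03-01", "dairy", 0.20),
    ("2024-03-02", "dairy", 0.10),
]
counts = daily_breadth(records)
print(breadth_range(counts, "dairy"))  # (1, 2)
```

With months of such data, a stable (min, max) range per category is exactly the kind of predefined sale-breadth window a deliberate promotional strategy would produce.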


Mapping out data

From enough breadth and depth data, a promotional strategy, if there is one, can be derived. Measuring the impact of such strategies is, unfortunately, somewhat difficult but not entirely impossible.

Some online retailers display the quantity of a specific product held in stock. While these figures usually cap out at "1000+", it's possible to measure the correlation between estimated sales volume and the discount.

For such insights to be available, web scraping should be used to collect historical stock numbers, and the average daily delta (i.e. the change in held stock) should be measured. When a product goes on sale, the deltas can be compared to find out whether there is any correlation.
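The delta comparison above can be sketched as follows. The input shapes are assumptions: `stock` maps scrape dates to observed units in stock, and `sale_dates` is the set of dates the product was discounted.

```python
# Sketch: compare average daily stock deltas on sale vs non-sale days.
# A more negative delta on sale days suggests discounts speed up
# sell-through; direct purchase volume is not observable.

def daily_deltas(stock):
    dates = sorted(stock)
    return {b: stock[b] - stock[a] for a, b in zip(dates, dates[1:])}

def avg_delta(deltas, dates):
    chosen = [deltas[d] for d in dates if d in deltas]
    return sum(chosen) / len(chosen) if chosen else 0.0

stock = {"d1": 900, "d2": 880, "d3": 800, "d4": 790}
sale_dates = {"d3"}  # product discounted on d3 only
deltas = daily_deltas(stock)
print(avg_delta(deltas, sale_dates))               # -80.0
print(avg_delta(deltas, set(deltas) - sale_dates)) # -15.0
```

The same skeleton works for the review-count approach described below: substitute review totals for stock numbers and the deltas flip sign.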

A similar approach could be used on extremely popular products that have high review counts. Review count deltas could again be measured and compared whenever products go on sale.

Both of these approaches, however, have drawbacks: they are approximations, since direct purchase volume is unavailable. With enough historical data and under the right conditions, however, these approximations can show whether pricing strategies bring in the expected benefit.

In the end, web scraping enables us to look deeper than the direct price shifts on competitor websites. It can provide great insight into the deeper layers of pricing strategies and unveil the potential they might hold for our own businesses.