Scrape data from Walmart

You are browsing a tutorial guide for the latest Octoparse version. If you are running an older version of Octoparse, we strongly recommend you upgrade, as the latest version is faster, easier, and more robust! Download and upgrade here if you haven't already done so!

Walmart is a large retail corporation in the United States. In this tutorial, we are going to show you how to scrape product data from Walmart.com.

You can also go to "Template Gallery" on the main screen of the Octoparse scraping tool and start with the ready-to-use Walmart template directly to save time. With this feature, there is no need to configure the scraping task yourself. For further details, check out Task Templates.


If you would like to know how to build the task from scratch, you may continue reading the following tutorial.

Suppose we want to scrape specific information about headphones. We can start from the home page (https://www.walmart.com/) to create our crawler, and scrape data such as the product title, price, product ID, and reviews from the product detail pages with Octoparse.

The main steps are outlined below, and you can download the sample task file here.


1. Open the target web page

  • Enter the URL on the home page and click Start

  • Click the search box and then click Enter text on the Tips panel

  • Type "Headphone" and confirm

  • Click on the Enter Text action, set it to hit the Enter/Return key after input, then click "Apply" to confirm

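For reference, here is what the same search interaction looks like when scripted outside Octoparse. This is a minimal sketch using Python and Selenium; the name="q" selector for the search box is an assumption and may change whenever Walmart updates its site.

    # A minimal sketch of the same search interaction with Selenium.
    # Assumes Selenium 4+ is installed and can locate a Chrome driver.
    from selenium import webdriver
    from selenium.webdriver.common.by import By
    from selenium.webdriver.common.keys import Keys

    driver = webdriver.Chrome()
    driver.get("https://www.walmart.com/")

    search_box = driver.find_element(By.NAME, "q")  # hypothetical selector
    search_box.send_keys("Headphone")
    search_box.send_keys(Keys.RETURN)  # same effect as "hit the Enter/Return key"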

2. Create a Pagination - to scrape from multiple pages

  • Click on the Next Page button, select Loop click single element, and set the AJAX timeout to 10s


The auto-generated XPath for Pagination does not always work in this case, so we need to modify the XPath to make it scrape all the pages.

  • Click on Pagination

  • Input the XPath //a[@aria-label="Next Page"] in the Matching XPath box

  • Click Apply to confirm

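If you want to sanity-check this XPath outside Octoparse, you can evaluate it with Python's lxml. The snippet below runs it against a small hand-written fragment, since fetching Walmart pages programmatically usually triggers CAPTCHAs; the fragment only mimics the aria-label attribute the expression relies on.

    # Sanity-check the pagination XPath with lxml on a stand-in fragment.
    from lxml import html

    fragment = """
    <nav>
      <a aria-label="Previous Page" href="/search?page=1">Prev</a>
      <a aria-label="Next Page" href="/search?page=3">Next</a>
    </nav>
    """

    tree = html.fromstring(fragment)
    links = tree.xpath('//a[@aria-label="Next Page"]')
    print([a.get("href") for a in links])  # ['/search?page=3']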

3. Scrape data from the product list

  • Select the first product (make sure to include the whole product section)

  • Choose Select all sub-elements

  • Choose Select all

  • Choose Extract Data


Now, a Loop Item with an Extract Data step will be created in the workflow.

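Conceptually, a Loop Item with Extract Data is just a loop over matched product nodes, extracting sub-elements from each. Here is a rough equivalent in Python with lxml; the class names in the fragment are made up for illustration and do not match Walmart's real markup.

    # What "Loop Item + Extract Data" does, expressed as a loop with lxml.
    from lxml import html

    listing = """
    <div class="results">
      <div class="product"><span class="title">Headphone A</span><span class="price">$19.99</span></div>
      <div class="product"><span class="title">Headphone B</span><span class="price">$29.99</span></div>
    </div>
    """

    tree = html.fromstring(listing)
    for product in tree.xpath('//div[@class="product"]'):  # the "loop item"
        row = {
            "title": product.xpath('string(.//span[@class="title"])'),
            "price": product.xpath('string(.//span[@class="price"])'),
        }
        print(row)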
  • Double-click a field name to rename it, or click ... to delete unwanted fields


If all the data you want can be scraped from the listing page, you can jump straight to Step 6: Run extraction - run your task and get data.


4. Click into each product link to scrape data - to get data from product pages

Some information like product descriptions can only be grabbed from the product detail page. We need to click on each product link to get the data.

  • Click on the first product link

  • Choose Click URL


A Click Item will be created in the workflow.

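In code terms, the Click Item step amounts to collecting the product links on the results page and visiting each one in turn. The sketch below shows this loop with Selenium; the link XPath is hypothetical, and as noted in step 6, automated access to Walmart often runs into CAPTCHAs, so treat it purely as an illustration.

    # The "Click URL" step as code: gather product links, then open each.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Chrome()
    driver.get("https://www.walmart.com/search?q=Headphone")  # results page

    product_links = [
        a.get_attribute("href")
        for a in driver.find_elements(By.XPATH, '//a[@link-identifier]')  # hypothetical selector
    ]

    for url in product_links:
        driver.get(url)  # equivalent to clicking into the product page
        # ...extract detail-page fields here (see step 5)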

5. Extract data from the detail page

  • Select the data you want

  • Click Extract the text of the element or Extract the URL of the selected image

  • Double-click a field name to rename it, or click ... to delete unwanted fields

  • Set up a wait time for the Extract Data action


The auto-generated XPath of the data fields may fail to work after the web page updates, so we need to modify the XPath of each field. In this case, we have prepared some useful XPath expressions for this website.

  • Switch Data Preview to Vertical View

  • Double click on the XPath to modify it

  • Replace the XPath with the ones below


Product name: //h1

Price: //span[@itemprop="price"]

Product details: //h2[text()='Product details']/../following-sibling::div[1]

Specifications: //h2[text()='Specifications']/../following-sibling::div[1]
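To see how these expressions behave, you can evaluate them with lxml against a simplified fragment that mirrors the structure they expect (an h2 heading whose parent div is followed by a sibling div). The fragment below is hand-written for illustration; the real page is much larger.

    # Evaluate the detail-page XPaths with lxml on a simplified fragment.
    from lxml import html

    detail = """
    <main>
      <h1>Example Headphone</h1>
      <span itemprop="price">$19.99</span>
      <section><div><h2>Product details</h2></div><div>Wireless, 20h battery</div></section>
      <section><div><h2>Specifications</h2></div><div>Bluetooth 5.0</div></section>
    </main>
    """

    tree = html.fromstring(detail)
    fields = {
        "Product name": tree.xpath("string(//h1)"),
        "Price": tree.xpath('string(//span[@itemprop="price"])'),
        "Product details": tree.xpath("string(//h2[text()='Product details']/../following-sibling::div[1])"),
        "Specifications": tree.xpath("string(//h2[text()='Specifications']/../following-sibling::div[1])"),
    }
    print(fields)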


6. Run extraction - run your task and get data

  • Click Save

  • Click Run on the upper left side

  • Select Run task on your device to run the task on your computer

Note: Walmart tasks cannot be run in the Cloud due to CAPTCHA issues. You can only run it on your device for now.


When the run finishes, you can check the sample output and export the data.