
Building A Web Crawler Using Octoparse

Welcome, friends! Remember the write-up on the top twenty web scraping tools? Octoparse made that list as one of the most powerful options.

Recently, I picked up the tool and was impressed by how much Octoparse lets its users do. In this article, you'll see what Octoparse is about, get an introduction to its built-in scraper, and learn how to build your own scraper from scratch.

Octoparse is a tool for scraping data from websites. It is an easy-to-use web crawling application that fetches data without requiring you to write a single line of code.

Octoparse is not complicated to use: in just three steps, you can do great things with this powerful web crawling tool. All you need is the URL you want to extract data from and a couple of clicks.

There is no limitation on the kinds of websites it can scrape, and the extracted data can be exported as a CSV file or accessed through an API.
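Once exported, the CSV is easy to process in any language. Here's a minimal Python sketch using only the standard library; the column names (`title`, `price`) and the inline sample data are hypothetical stand-ins for a real Octoparse export:

```python
import csv
import io

# Simulated contents of an Octoparse CSV export.
# The columns here are hypothetical; use the fields from your own task.
exported = """title,price
Nike Air Max,120.00
Nike Revolution,65.50
"""

# Parse the export the same way you would a real file on disk.
rows = list(csv.DictReader(io.StringIO(exported)))
prices = [float(row["price"]) for row in rows]
print(f"{len(rows)} products, average price {sum(prices) / len(prices):.2f}")
# prints: 2 products, average price 92.75
```

For a real export, replace the inline string with `open("export.csv")` and the rest of the code stays the same.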

Some of Octoparse's standout features:

  • It lets you build web crawlers quickly without writing a line of code
  • It provides a cloud service for scheduled data extraction and IP rotation
  • It offers unlimited storage
  • It lets you hire professional data scraping experts from Octoparse to do the job for you

With that, you have a solid idea of what Octoparse is and what it's for. Now let's get started.

Getting Started With Octoparse

Before building our first web crawler, let's set up our development environment. Start by downloading Octoparse from the official website; I recommend version 7.1.

Why Octoparse 7.1?

Octoparse 7.1 comes with features you won't find in older versions of the tool:

  • Task templates: predefined templates for scraping data from popular websites such as Amazon or eBay.
  • A restructured dashboard that provides more information to the user.
  • The ability to scrape data from multiple URLs by importing them from an Excel sheet, CSV or text file.
  • An anti-blocking feature to bypass protections that prevent users from scraping data from a website.

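The multi-URL import above expects a file with one URL per line. Here's a quick Python sketch for generating such a file for a paginated site; the URL pattern is a hypothetical example, not a real endpoint:

```python
# Build a list of paginated URLs to feed into Octoparse's batch import.
# The base URL and page range are hypothetical examples.
base = "https://example.com/products?page={}"
urls = [base.format(page) for page in range(1, 6)]

# The import accepts a plain text file with one URL per line.
with open("urls.txt", "w") as f:
    f.write("\n".join(urls))

print(urls[0])  # https://example.com/products?page=1
```

This beats typing out dozens of near-identical URLs by hand when a site paginates its listings.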
You can download the Octoparse 7.1 executable from the website. It only works on Windows, so you'll need a virtual machine such as VirtualBox to run it on a Linux machine. Octoparse provides a guide on using the tool for Linux users.

Introduction To Task Template

Task templates are a feature introduced in the latest version of Octoparse, designed to make web scraping easier for everyone regardless of technical knowledge.

How To Use Task Template

There is no lengthy process to using task templates. You only need to supply a few inputs: the target URL, keywords to search for, and whatever other parameters are needed to extract the data you want from the website.

Octoparse ships with built-in templates for popular websites, including Google, Amazon, eBay and Walmart. Let's try one of them.

Start by selecting a template of your choice; in this case, let's use the eBay task template. After selecting the template, you will be prompted to input your parameters, such as a target URL or a keyword to search for.

In the parameter box, input "Nike shoes" as the keyword. Octoparse then does the rest, fetching all data matching your parameters; in this case, all Nike shoe listings. The data is ready to be used for whatever purpose you have in mind.

For further analysis of your scraped data, navigate to the data field tab of the task template to view extra information on everything on the web page, including the Nike shoe images, seller names, prices and inventory counts.

You can also open the sample output tab to preview data such as the product name, product URL and more for virtually every Nike shoe listing on eBay.

You’ve seen how easy it is to scrape data with task template. Play around with the task template and scrape data from eBay. Try out other built-in task templates such as Walmart or Google with Octoparse.

Building A Web Crawler With Octoparse

You’ve come this far to build a web crawler with Octoparse. You do have a piece of foundational knowledge and all there is to know about in scraping data from a website with the use of a task template. However, you can build a web crawler yourself.

There are two approaches to building a web crawler with Octoparse:

  • Wizard Mode
  • Advanced Mode

Building A Web Crawler With Octoparse Wizard Mode

The Wizard Mode approach is the easier and faster way to scrape data from a website. With its smooth step-by-step interface, you can have a web crawler up and running in no time. For more complex scraping jobs, though, Advanced Mode is recommended.

With Wizard Mode, you can scrape data from tables, links or items on pages. Within the scope of this tutorial, you'll learn to build a web crawler for a single web page.

To begin, launch your Octoparse application, create a new task from Wizard Mode, and enter the URL you'd like to scrape data from. You can rename the Group input field to anything that seems cool to you, then click the Next button.

You will be taken to a new page to select the extraction type; since you are scraping data from a single web page, select the single page option. With your extraction type defined, you can now define your fields.

To define your fields, select the target data on the web page; Octoparse auto-fills the data into the fields. You can then edit each field's properties however you like, and add more data by clicking the add more fields button.

By following these steps, you will be able to extract data from a single web page in less than five minutes.

Building A Web Crawler With Octoparse Advanced Mode

Wizard Mode works for scraping simple websites with straightforward structures, but websites with more complex structures are a tougher task. Advanced Mode is the tool you'll use to scrape such websites.

Go ahead and launch your Octoparse application; under Advanced Mode, create a new task, enter the URL you'd like to scrape data from, and hit the save button. This takes you to the task configuration workflow.

The task configuration workflow interface gives you more flexibility in how you extract data. The workflow predefinition feature is turned off by default, so turn it on to get started.

In Advanced Mode, when you select data on the web page, Octoparse shows action tips for the selected data.

On the web page you want to crawl, click an item and the action tips appear at the bottom right of the page. They let you choose what to do with the selection, such as extracting its data.

In Advanced Mode you'll spend most of your time creating the workflow that describes how to extract data; once you are past that stage, the task workflow is ready for use. Simply click the start extraction button and Octoparse works according to your workflow.

Working with Advanced Mode might seem a bit difficult for first-timers, but you'll become more comfortable with it over time.

Conclusion

You can scrape websites by writing your own web scrapers, but that can be time consuming. Octoparse gives you great results without writing code or spending time on scraper logic.
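For contrast, here is roughly what the hand-written route involves: locating items in raw HTML yourself. This sketch uses only Python's standard library and parses an inline page fragment (the markup and the `item` class name are made up for illustration); a real scraper would also have to fetch the page, handle errors, and keep up with layout changes.

```python
from html.parser import HTMLParser

# A hypothetical product-listing fragment, inlined so the sketch runs offline.
PAGE = """
<ul>
  <li class="item">Nike Air Max</li>
  <li class="item">Nike Revolution</li>
</ul>
"""

class ItemParser(HTMLParser):
    """Collects the text of every <li class="item"> element."""

    def __init__(self):
        super().__init__()
        self.in_item = False
        self.items = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the opening tag.
        if tag == "li" and ("class", "item") in attrs:
            self.in_item = True

    def handle_data(self, data):
        if self.in_item and data.strip():
            self.items.append(data.strip())
            self.in_item = False

parser = ItemParser()
parser.feed(PAGE)
print(parser.items)  # ['Nike Air Max', 'Nike Revolution']
```

Every selector in code like this breaks the moment the site redesigns its markup; that maintenance burden is exactly what a point-and-click tool spares you.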

In this article, you've seen what Octoparse is about and how it saves you time and effort. You've also seen how to use the built-in task templates to scrape data from popular websites, and how to build your own powerful web crawlers.

Octoparse is currently available only as a Windows executable, so you'll need a virtual machine such as VirtualBox to use it on a Linux machine.

Visit the official Octoparse website to learn more about Advanced Mode and Wizard Mode so you can scrape even more websites.

About the author

Habeeb Kenny Shopeju

I love building software and am very proficient with Python and JavaScript. I'm very comfortable with the Linux terminal and interested in machine learning. In my spare time, I write prose, poetry and tech articles.