
Web Scraping Book Information from a Website using BeautifulSoup and Pandas


Web scraping is a powerful technique for extracting data from websites, and Python offers several libraries for this purpose. In this tutorial, we'll walk through a Python script that uses BeautifulSoup, requests, and Pandas to scrape book information from https://books.toscrape.com/.

Step 1: Importing Libraries

We begin by importing the necessary libraries. BeautifulSoup is used for parsing HTML content, requests for making HTTP requests to the website, and Pandas for creating and manipulating data frames.
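
A minimal sketch of the imports (the pd alias for Pandas is a common convention):

from bs4 import BeautifulSoup  # parses the HTML content
import requests                # fetches the web page
import pandas as pd            # builds and exports the data frame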


Step 2: Fetching Web Page Content

Next, we specify the URL of the website and use the requests library to fetch the HTML content of the page. We then decode the raw response bytes into a string so that special characters are handled correctly.
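
A sketch of this step; the variable names are illustrative and may differ from the full script:

url = 'https://books.toscrape.com/'
response = requests.get(url)
html = response.content.decode('utf-8')  # decode the raw bytes into a string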



Step 3: Extracting Book Information

The book information is contained within <ol> (ordered list) tags on the webpage. We use BeautifulSoup to find all the <ol> tags.
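
A sketch, continuing from the html string fetched in the previous step:

soup = BeautifulSoup(html, 'html.parser')
book_lists = soup.find_all('ol')  # every ordered list on the page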



Step 4: Creating a DataFrame

We define the column names for our data frame and create an empty data frame using Pandas.
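
A sketch of the empty data frame; these column names are illustrative and may not match the full script exactly:

columns = ['Title', 'Link', 'Price', 'Availability']
books_df = pd.DataFrame(columns=columns)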


Step 5: Looping Through Book Elements

We loop through each <ol> tag to find the <li> tags (list items) containing book information. Within each <li>, we locate the <article> tag that encompasses the book details.
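
A sketch of the two nested loops, continuing with the names used above:

for book_list in book_lists:
    for li in book_list.find_all('li'):
        article = li.find('article')  # one <article> per book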



Step 6: Extracting Book Details

Within the <article> tag, we locate the <h3> tag to get the book title and link. We also access the price and availability information.
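
A sketch of the extraction, still inside the loops from the previous step. The class names 'price_color' and 'instock' match the markup on books.toscrape.com at the time of writing, but you should confirm them in your browser's inspector:

        h3 = article.find('h3')
        title = h3.a['title']  # the full title is stored in the anchor's title attribute
        link = h3.a['href']    # relative link to the book's detail page
        price = article.find('p', class_='price_color').text
        availability = article.find('p', class_='instock').text.strip()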



Step 7: Populating the DataFrame

For each book, we insert a new row into the data frame.
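
A sketch of appending a row, still inside the loops; assigning to loc with the current length adds the row at the end:

        books_df.loc[len(books_df)] = [title, link, price, availability]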


Step 8: Exporting Data to CSV

Finally, we export the data frame to a CSV file.
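
A sketch of the export; the file name here is illustrative:

books_df.to_csv('books.csv', index=False)  # index=False drops the row numbers from the CSV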


That's it! You've successfully scraped book information from a website and stored it in a CSV file using Python. Feel free to customize the code for your specific needs or explore additional features provided by BeautifulSoup and Pandas. Happy coding!

For the complete code, see: https://github.com/Aaminah27/Python-Scripts/blob/main/books_website_scraping.py
