
Creating a simple form with Flask-WTF



Tools: Visual Studio

Technology: Python, Flask

Assumptions: You already know the basics of Flask, HTML, and Python.

Steps:

  • First, install Flask-WTF, a library for creating forms: type "pip install flask-wtf" in the terminal and press enter.
  • Next, set a secret key in your app.py file. Flask-WTF needs this key to protect the forms (it signs the CSRF token). You can assign any value of your choice to the key.
    • app.config['SECRET_KEY'] = 'dnbna'
  • Once that is done, create a new Python file, "forms.py", and import the required modules:


  • Now create a class and initialize the fields; your forms.py will look something like this:


  • In the step above, we created three fields for a signup form and initialized them.
  • To display the form, we need an HTML template. So create form.html and render the fields created above:

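The template screenshot is missing too; a minimal form.html sketch, assuming the SignupForm fields above (form.hidden_tag() renders the hidden CSRF token that the secret key enables):

```html
<!-- form.html: renders the fields defined in SignupForm -->
<form method="POST" action="/signup">
    {{ form.hidden_tag() }}
    {{ form.username.label }} {{ form.username() }}
    {{ form.email.label }} {{ form.email() }}
    {{ form.password.label }} {{ form.password() }}
    {{ form.submit() }}
</form>
```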

  • Now go to your main app.py file, where you create the routes for the application and import the signup form class by:
    • from forms import SignupForm
  • Once you have done this, create a signup route, instantiate the form object, and pass it to the form.html page.


RESULT:
That's all. Now run the Flask application and enter "localhost:5000/signup" in the browser to see the rendered signup form.

Right now the form does nothing on submission; the goal here was just to show how to create a very simple form using the Flask-WTF library.

Thanks to https://www.youtube.com/@TheCodex for helping me to learn these concepts.

