It is a form of automated copying in which specific data is gathered from the web, typically into a central local database or spreadsheet, for later retrieval or analysis.
The whole process is carried out by a piece of code called a “scraper”. First, the scraper sends an HTTP GET request to a specific website, and the server responds with an HTML document. The scraper then parses that document, searches it for the data of interest, and finally converts the extracted data into whatever format was specified.
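The steps above can be sketched with nothing but the Python standard library. The sketch below is illustrative, not a production scraper: the `TitleScraper` class name, the choice of `<h2>` elements as the target data, and the sample HTML are all assumptions made for the example. In a real run, the HTML would come from the GET request (e.g. via `urllib.request.urlopen`); here a small inline sample keeps the sketch self-contained.

```python
from html.parser import HTMLParser


class TitleScraper(HTMLParser):
    """Parses an HTML document and collects the text of every <h2> element."""

    def __init__(self):
        super().__init__()
        self._in_h2 = False
        self.titles = []

    def handle_starttag(self, tag, attrs):
        if tag == "h2":
            self._in_h2 = True

    def handle_endtag(self, tag):
        if tag == "h2":
            self._in_h2 = False

    def handle_data(self, data):
        if self._in_h2:
            self.titles.append(data.strip())


# In practice this string would be the response body of a GET request, e.g.:
#   html = urllib.request.urlopen("https://example.com").read().decode()
html = "<html><body><h2>First</h2><p>text</p><h2>Second</h2></body></html>"

scraper = TitleScraper()
scraper.feed(html)          # the "search the document" step
print(scraper.titles)       # the extracted data, ready to be stored or converted
```

From here, the `titles` list could be written out in any specified format, for instance as rows of a CSV file via the `csv` module.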