How to Scrape Google Maps: A Comprehensive Guide

Tutorials · Data acquisition

Danielius Radavicius
2023-06-09 · 5 min read
In the current day and age, where public web data scraping has become a foundation for
many businesses, it’s unsurprising to see that Google Maps is yet another area commonly
scraped for its valuable data. In this article, we’ll discuss what this data may be and how to
build a scraper that gathers it using an Oxylabs solution.
Before we get started, let’s briefly look at the legalities of scraping Google Maps. The legality of
web scraping is a much-debated topic among everyone who works in the data-gathering
field. It’s important to note that web scraping may be legal in cases where it’s done without
breaching any laws regarding the source targets or data itself. That being said, we advise you
to seek legal consultation before engaging in scraping activities of any kind.
We’ve explored the legality of web scraping in this blog post, so feel free to check it out for a
more in-depth explanation.
Why scrape Google Maps?
The core purposes of scraping Google Maps are numerous. From a research perspective, a
user may want to employ a Google Maps data scraper to analyze demographic information
or transportation routes. For businesses, a Google Maps scraper may be the go-to tool for
competitor analysis, as it allows you to collect data on competitors' locations, customer reviews,
and ratings. Gathering real estate/property listings is a possible use case as well.
Overall, this makes Google Maps data scraping a highly lucrative solution that many
businesses are certain to make use of.
Should you use the official Google Maps API?
Quite a few popular websites like Twitter or Amazon provide their own APIs. Google is no
exception, so the question naturally arises: why not use the official Google Maps API?
Let’s begin with the price. Each user gets a $200 monthly credit for API calls. This $200 covers:
• Up to 40,000 Geolocation calls
• Up to 100,000 Static Maps loads
• Up to 28,000 Dynamic Maps loads
• Up to 40,000 Directions calls
At first glance, this may appear to be plenty, but it likely isn’t. Google’s API, like many other APIs,
begins to charge you once the given amount is used up. Then, imagine a scenario where you
use the Embed API in Directions, Views, and Search modes. Suppose your service loads a map
that initiates address search through autocomplete. This singular request is now using up 2
different API calls. Add another requirement, say geolocation services for directions or
distances, and now a single request is taking up 3 separate API calls. Furthermore, as your
business scales, so does the daily number of calls you’ll make, meaning that after a certain point,
the Google Maps API becomes an unbelievably pricey solution.
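To see how stacked calls inflate costs, here is a rough back-of-the-envelope sketch. The per-call price and free credit below are illustrative assumptions, not Google's actual rates:

```python
# Illustrative cost sketch: the per-1000-calls price and free credit here are
# made-up assumptions for demonstration, not Google's real pricing.
def monthly_api_cost(requests_per_day, calls_per_request,
                     price_per_1000_calls=5.0, free_credit=200.0):
    """Estimate monthly spend once the free credit is exhausted."""
    total_calls = requests_per_day * 30 * calls_per_request
    gross = total_calls / 1000 * price_per_1000_calls
    return max(0.0, gross - free_credit)

# A request that touches autocomplete, maps, and geolocation bills 3 calls:
print(monthly_api_cost(10_000, 3))  # 4300.0
```

The point is the multiplier: the same daily traffic at 3 calls per request costs three times as much as at 1 call per request, so stacked API modes dominate the bill as you scale.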
Yet, the high price isn’t the only limitation of Google’s own API. There are also strict request
limitations. Google’s current enforced rate limit is up to 100 requests per second.
Google is also known to implement unpredictable changes that offer little benefit to their users,
such as the limits imposed in 2010.
However, products like Oxylabs' Google Maps API solution are specifically made to avoid
limitations such as the ones mentioned above, which is why they’re commonly chosen instead
of official APIs.
How do I extract data from Google Maps?
Before you begin
To scrape Google Maps data, you will need Oxylabs' SERP Scraper API. Sign up for Google
Search Results API and take note of your username and password.
Replace USERNAME with your username and PASSWORD with your password throughout the
code samples in this guide.
Setting Up Your Project Environment
Before writing code to scrape data from Google Maps, we must set up a project environment
and install the necessary Python libraries.
Create a new virtual environment to separate your project dependencies from your system
packages. Ensure that you have Python 3.8 or newer installed. Run the following command in a
terminal:
$ python3 -m venv env

On Windows, use python instead of python3. Next, activate the virtual environment:

Windows: env\Scripts\activate
macOS/Linux: source env/bin/activate
Install the required Python libraries for this project. We'll be using beautifulsoup4, requests, and
pandas. You can install them by running the following:
$ pip install beautifulsoup4 requests pandas
With your project environment set up, we're ready to start writing code to scrape Google Maps
data.
Fetching Data Using the Google Scraper API
We'll be using Oxylabs' Google Search API to fetch data from Google Maps. This API allows us
to send HTTP requests to Google and receive the HTML content of the search results page. For a
detailed tutorial, see How to Scrape Google Search Results.
1. First, open Google Maps in your browser and search for "restaurants near me". You will see the
search results with the restaurants' names, ratings, hours, and other data points.
2. Copy this URL. We will use Google Search Scraper API to fetch data from this URL.
3. To use Google Search Results Scraper API, we need to set the following parameters:
• source : This will be google.
• url : The URL that you copied after searching for "restaurants near me".
• geo_location : Google Scraper API allows us to use any location for the search.
4. Create a dictionary as follows that will contain these parameters:
payload = {
    "source": "google",
    "url": "https://www.google.com/search?tbs=lf:1,lf_ui:9&tbm=lcl&q=restaurants+near+me",
    "geo_location": "New York,New York,United States",
}
5. The next step is to send these parameters to the API endpoint. For this, we can use the
requests library to send a POST request as follows:
import requests

response = requests.request(
    "POST",
    "https://realtime.oxylabs.io/v1/queries",
    auth=("USERNAME", "PASSWORD"),
    json=payload,
    timeout=180,
)
Replace USERNAME and PASSWORD with your actual username and password.
6. If everything went well, you should get a response with status code 200.
7. You can get the HTML from the results as follows:
html = response.json().get("results")[0].get("content")
8. The next step is to parse this HTML.
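To make step 7 concrete, here is a minimal mock of the response shape (a made-up sample, not a live API call); the page HTML sits under the first item of the "results" list:

```python
# Minimal, made-up mock of the scraper API's JSON response shape.
# A real response contains more fields; only the part we read is shown.
sample_response = {"results": [{"content": "<html><body>...</body></html>"}]}

# Same access pattern as in the step above:
html = sample_response.get("results")[0].get("content")
print(html)
```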
Parsing Google Maps Data
Once we have the HTML content of the search results page, we can use the BeautifulSoup
library to parse the data. In this example, we'll extract the following data points from each
place listed in the search results—Name, Place Type, Address, Rating, Price Level, Rating Count,
Latitude, Longitude, Hours, and other details.
First, open the browser and open the same URL that you used in the code. Right-click on any of
the listings and select Inspect.
Try to create a selector that selects exactly one listing at a time.
One possible selector is [role='heading'] . The other is [data-id] . We will use
[data-id] in this example.
We can loop over all the matches and look for specific data points.
The next step is to create a CSS selector for each data point you want to scrape. For example,
you can select the name of the restaurant with the following CSS selector:
[role='heading']
The following are all the selectors:
name_selector = "[role='heading']"
type_selector = ".rllt__details div:nth-of-type(2)"
address_selector = ".rllt__details div:nth-of-type(3)"
hours_selectors = ".rllt__details div:nth-of-type(4)"
rating_count_selector = 'span:contains("(")'
rating_selector = "[aria-hidden='true']"
details_selector = ".rllt__details div:nth-of-type(5)"
price_selector = "span[aria-label*='xpensive']"
lat_selector = "[data-lat]"
lng_selector = "[data-lng]"
We can use BeautifulSoup's select and select_one methods to select elements and then
extract the text within those elements.
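As a minimal illustration of how select and select_one behave, here is a toy snippet; the sample markup below is made up and far simpler than the real results page:

```python
from bs4 import BeautifulSoup

# Toy markup mimicking one listing; real Google Maps HTML is far more complex.
sample = """
<div>
  <div data-id="p1"></div>
  <div role="heading">Joe's Pizza</div>
  <div class="rllt__details"><div>4.3</div><div>Pizza</div></div>
</div>
"""
soup = BeautifulSoup(sample, "html.parser")

# select() returns a list of all matches; select_one() returns the first or None.
headings = soup.select("[role='heading']")
name = soup.select_one("[role='heading']").get_text()
print(len(headings), name)  # 1 Joe's Pizza
```

Because select_one may return None when a data point is missing from a listing, every extraction below guards with an if-else before calling get_text().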
Rating count needs a different approach. The rating count is enclosed in brackets along with
the rating, for example, 4.3(513). We can use a regex to extract the value within the brackets
as follows:
count_match = re.search(r"\((.+)\)", rating_count_el.text)
rating_count = count_match.group(1) if count_match else ""
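As a quick, self-contained check, the same pattern applied to a sample string:

```python
import re

# "4.3(513)": the rating is 4.3 and the count, 513, sits inside the brackets.
text = "4.3(513)"
match = re.search(r"\((.+)\)", text)
rating_count = match.group(1) if match else ""
print(rating_count)  # 513
```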
Putting everything together, the following code generates a list of dictionaries that contain all
the data from all the listings on the page:
import re

from bs4 import BeautifulSoup

soup = BeautifulSoup(html, "html.parser")
data = []
for listing in soup.select("[data-id]"):
    place = listing.parent
    name_el = place.select_one(name_selector)
    name = name_el.get_text() if name_el else ""

    rating_el = place.select_one(rating_selector)
    rating = rating_el.get_text() if rating_el else ""

    rating_count_el = place.select_one(rating_count_selector)
    rating_count = ""
    if rating_count_el:
        count_match = re.search(r"\((.+)\)", rating_count_el.text)
        rating_count = count_match.group(1) if count_match else ""

    hours_el = place.select_one(hours_selectors)
    hours = hours_el.get_text() if hours_el else ""

    details_el = place.select_one(details_selector)
    details = details_el.get_text() if details_el else ""

    price_level_el = place.select_one(price_selector)
    price_level = price_level_el.get_text() if price_level_el else ""

    lat_el = place.select_one(lat_selector)
    lat = lat_el.get("data-lat") if lat_el else ""

    lng_el = place.select_one(lng_selector)
    lng = lng_el.get("data-lng") if lng_el else ""

    type_el = place.select_one(type_selector)
    place_type = type_el.get_text().split("·")[-1] if type_el else ""

    address_el = place.select_one(address_selector)
    address = address_el.get_text() if address_el else ""

    data.append({
        "name": name,
        "place_type": place_type,
        "address": address,
        "rating": rating,
        "price_level": price_level,
        "rating_count": rating_count,
        "latitude": lat,
        "longitude": lng,
        "hours": hours,
        "details": details,
    })
The next step is to save this data as CSV.
Exporting Google Maps Data to CSV
With the data parsed, the final step is to export it to a CSV file. We'll use the Pandas library to
create a DataFrame and save it as a CSV file:
import pandas as pd

df = pd.DataFrame(data)
df.to_csv("data.csv", index=False)

When you run this code, it will save the data to a CSV file named data.csv (you can use any
file name you prefer).
Conclusion
Scraping Google Maps isn’t an easy task, but this guide should help you understand both how
the scraping process works and how it functions in tandem with our API solution. The aim of this
tutorial was to provide a comprehensive, step-by-step guide, but if you have any questions,
don’t hesitate to contact us or chat with our live support team, available 24/7.
About the author
Danielius Radavicius
Copywriter
Danielius Radavičius is a Copywriter at Oxylabs. Having grown up in films, music, and books and
having a keen interest in the defense industry, he decided to move his career toward tech-related
subjects and quickly became interested in all things technology. In his free time, you'll probably find
Danielius watching films, listening to music, and planning world domination.
All information on Oxylabs Blog is provided on an "as is" basis and for informational purposes only. We make no representation
and disclaim all liability with respect to your use of any information contained on Oxylabs Blog or any third-party websites that
may be linked therein. Before engaging in scraping activities of any kind you should consult your legal advisors and carefully
read the particular website's terms of service or receive a scraping license.
People also ask
Is it possible to collect data from Google Maps?
Yes, you can use various programming languages or automated solutions such as Google Maps data
extractor APIs to scrape Google Maps.
How to Scrape Google Maps: A Comprehensive Guide [Link]
Sign up
Related articles
Data acquisition Tutorials
How to Bypass CAPTCHA in Web
Scraping Using Python
Yelyzaveta Nechytailo
2023-10-03
Tutorials Scrapers
How to Make Web Scraping Faster
– Python Tutorial
Yelyzaveta Nechytailo
2023-03-29
Tutorials Scrapers
Puppeteer Tutorial: Scraping With
a Headless Browser
Gabija Fatenaite
2022-03-09
Get the latest news from
data gathering world
I’m interested
Scale up your business
with Oxylabs®
10 of 13 16-02-2024, 12:35 pm
How to Scrape Google Maps: A Comprehensive Guide [Link]
Register Sign up
Contact sales
11 of 13 16-02-2024, 12:35 pm
How to Scrape Google Maps: A Comprehensive Guide [Link]
Sign up
COMPANY PROXIES
About us Datacenter Proxies
Our values Shared Datacenter Proxies
Affiliate program Dedicated Datacenter Proxies
Service partners Residential Proxies
Press area Static Residential Proxies
Residential Proxies sourcing SOCKS5 Proxies
Careers Mobile Proxies
Our products Rotating ISP Proxies
OxyCon
Project 4beta
ADVANCED PROXY SOLUTIONS
Sustainability
Web Unblocker
Trust & Safety
SCRAPER APIS RESOURCES
SERP Scraper API Developers Hub
E-Commerce Scraper API FAQ
Web Scraper API Documentation
Blog
TOP LOCATIONS INNOVATION HUB
United States Adaptive Parser
United Kingdom Oxylabs' Patents
Canada
Germany
India
All locations
12 of 13 16-02-2024, 12:35 pm
How to Scrape Google Maps: A Comprehensive Guide [Link]
Sign up
GET IN TOUCH
General: hello@[Link]
Support: support@[Link]
Career: career@[Link]
Certified data centers and upstream providers
English
Connect with us
Privacy Policy KYC Policy
Vulnerability Disclosure Policy
Speak Up Code of Conduct
©
[Link] 2024 All Rights Reserved
13 of 13 16-02-2024, 12:35 pm