Open Brewery DB Scraper
Pricing
from $8.00 / 1,000 results
Scrape brewery listings across the United States. Get brewery names, types (micro, brewpub, large, contract), full addresses, GPS coordinates, phone numbers, and websites for 8,000+ breweries. Filter by state, city, brewery type, and postal code. Great for lead generation and market analysis.
Developer: ParseForge
Last modified: 5 hours ago
🍺 Open Brewery DB Scraper
📋 Collect structured brewery data with names, types, full addresses, phone numbers, website URLs, and GPS coordinates from the Open Brewery DB. Filter by state, city, type, and postal code.
📅 Last updated: 2026-04-17
The Open Brewery DB is an open-source dataset of breweries, primarily covering the United States. This Actor queries that database and returns structured records with brewery names, types (micro, nano, regional, brewpub, large, and more), full street addresses, phone numbers, website URLs, and latitude/longitude coordinates. You can filter by state, city, brewery type, postal code, and sort order.
If you are building a brewery finder app, researching the craft beer market, planning brewery tours, or prospecting for distribution partnerships, this scraper collects the data you need in seconds. Set it on a monthly schedule to keep your brewery database current as new locations open and others close.
| Target | Open Brewery DB |
|---|---|
| Use Cases | Brewery finder apps, market research, tour planning, distribution prospecting |
🔍 What it does
- 🏭 Brewery listings. Names, types, and unique IDs for breweries across the United States.
- 📍 Full addresses. Street addresses, cities, states, postal codes, and countries.
- 📞 Contact details. Phone numbers and website URLs for each brewery.
- 🗺️ GPS coordinates. Latitude and longitude for mapping and geospatial analysis.
- 🔎 Location and type filters. Search by state, city, postal code, and brewery type.
Each record includes the brewery ID, name, type, street address, city, state, postal code, country, phone number, website URL, latitude, longitude, and scrape timestamp.
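Because every record carries latitude and longitude, simple geospatial work is possible directly on the exported data. As a minimal sketch (not part of the Actor itself), the standard haversine formula computes the distance between two brewery records; the dict keys follow the field names listed above, and note that the Actor returns coordinates as strings:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # mean Earth radius ~6371 km

def brewery_distance_km(a, b):
    """Distance between two scraped records; coordinates arrive as strings."""
    return haversine_km(float(a["latitude"]), float(a["longitude"]),
                        float(b["latitude"]), float(b["longitude"]))
```

This is handy for tour planning, e.g. sorting a state's breweries by distance from a starting point.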
💡 Why it matters: Manually looking up brewery details one by one takes ages. This Actor collects hundreds of records with consistent formatting, complete with GPS coordinates for mapping, in a single run.
🎬 Full Demo
🚧 Coming soon: a 3-minute walkthrough showing how to go from sign-up to a downloaded dataset.
⚙️ Input
| Input | Type | Default | Behavior |
|---|---|---|---|
| state | string | "California" | Filter breweries by US state (e.g., "California", "New York", "Texas"). |
| city | string | - | Optional city filter within the selected state. |
| postalCode | string | - | Filter by ZIP code (e.g., "44107" or "44107-9863"). |
| breweryType | string | - | Filter by type: micro, nano, regional, brewpub, large, planning, bar, contract, proprietor, or closed. |
| sort | string | - | Sort by: name, city, state, or type. |
| maxItems | integer | 10 | Maximum breweries to return. Free users limited to 10. Paid users up to 1,000,000. |
Example: Microbreweries in California sorted by name.
{"state": "California", "breweryType": "micro", "sort": "name", "maxItems": 50}
Example: All breweries in Portland.
{"state": "Oregon", "city": "Portland", "maxItems": 100}
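When assembling these input objects programmatically, it can help to validate them against the allowed values from the input table before launching a run. A small illustrative helper (`build_run_input` is a hypothetical name, not part of the Actor):

```python
# Allowed values taken from the input table above.
ALLOWED_TYPES = {"micro", "nano", "regional", "brewpub", "large",
                 "planning", "bar", "contract", "proprietor", "closed"}
ALLOWED_SORTS = {"name", "city", "state", "type"}

def build_run_input(state, city=None, postal_code=None,
                    brewery_type=None, sort=None, max_items=10):
    """Assemble and sanity-check an input object for this Actor."""
    if brewery_type is not None and brewery_type not in ALLOWED_TYPES:
        raise ValueError(f"unknown breweryType: {brewery_type}")
    if sort is not None and sort not in ALLOWED_SORTS:
        raise ValueError(f"unknown sort: {sort}")
    run_input = {"state": state, "maxItems": max_items}
    if city:
        run_input["city"] = city
    if postal_code:
        run_input["postalCode"] = postal_code
    if brewery_type:
        run_input["breweryType"] = brewery_type
    if sort:
        run_input["sort"] = sort
    return run_input
```

Catching a typo like `breweryType: "mega"` locally is cheaper than discovering an empty dataset after the run.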
⚠️ Good to Know: The Open Brewery DB is community-maintained and focuses primarily on US breweries. Some entries may have missing phone numbers or GPS coordinates. Schedule periodic runs to catch new additions.
📊 Output
Each record contains 15+ fields. Download as CSV, Excel, JSON, or XML.
🧾 Schema
| Field | Type | Example |
|---|---|---|
| 📛 name | string | "Sierra Nevada Brewing Co" |
| 🏷️ breweryType | string | "regional" |
| 📍 address1 | string | "1075 E 20th St" |
| 🏙️ city | string | "Chico" |
| 🗺️ stateProvince | string | "California" |
| 📮 postalCode | string | "95928-6722" |
| 🌍 country | string | "United States" |
| 📞 phone | string | "5308932520" |
| 🌐 websiteUrl | string | "http://www.sierranevada.com" |
| 📍 latitude | string | "39.7242" |
| 📍 longitude | string | "-121.7867" |
| 🆔 id | string | "sierra-nevada-brewing-co-chico" |
📦 Sample records
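For illustration, the example values from the schema table assemble into a record shaped like this (the scrape timestamp field is omitted here):

```json
{
  "id": "sierra-nevada-brewing-co-chico",
  "name": "Sierra Nevada Brewing Co",
  "breweryType": "regional",
  "address1": "1075 E 20th St",
  "city": "Chico",
  "stateProvince": "California",
  "postalCode": "95928-6722",
  "country": "United States",
  "phone": "5308932520",
  "websiteUrl": "http://www.sierranevada.com",
  "latitude": "39.7242",
  "longitude": "-121.7867"
}
```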
✨ Why choose this Actor
| | Capability |
|---|---|
| 🏭 | 10 brewery types. Micro, nano, regional, brewpub, large, planning, bar, contract, proprietor, and closed. |
| 🗺️ | GPS coordinates. Latitude and longitude for mapping and geospatial work. |
| 📞 | Contact details. Phone numbers and website URLs included. |
| 📍 | Location filtering. Search by state, city, or postal code. |
| 🔀 | Sortable results. Order by name, city, state, or type. |
| 📅 | Schedule-ready. Set monthly runs to track new brewery openings. |
| 📄 | Multiple exports. Download as CSV, Excel, JSON, or XML. |
The United States has over 9,000 craft breweries, and that number changes every month as new ones open and others close. Keeping a directory current requires regular data refreshes.
📊 How it compares to alternatives
| Approach | Cost | Coverage | Refresh | Setup |
|---|---|---|---|---|
| ⭐ Open Brewery DB Scraper (this Actor) | $5 free credit, then pay-per-use | US breweries + some intl | Live per run | ⚡ 2 min |
| Manual web searches | Free | Limited by time | Manual checks | 🐌 Hours |
| Open Brewery DB API | Free | Same dataset | Per request | 🔧 1-2 hours |
| Third-party brewery databases | $50-200/mo | Varies | Monthly | 🕐 30 min |
Pick this Actor when you want structured brewery data with GPS coordinates without writing API code or managing pagination yourself.
📖 How to use
- 📝 Sign up. Create a free account with $5 credit (takes 2 minutes).
- 🔎 Open the Actor. Go to the Open Brewery DB Scraper page on the Apify Store.
- 🎯 Set input. Choose a state, city, brewery type, and set your max items.
- 🚀 Run it. Click Start and let the Actor collect your data.
- 📥 Download. Grab your results in the Dataset tab as CSV, Excel, JSON, or XML.
⏱️ Total time from signup to downloaded dataset: 3-5 minutes. No coding required.
💼 Business use cases
🌍 Beyond business use cases
Data like this powers more than commercial workflows. The same structured records support research, education, civic projects, and personal initiatives.
🤖 Ask an AI assistant about this scraper
Open a ready-to-send prompt about this ParseForge actor in the AI of your choice:
- 💬 ChatGPT
- 🧠 Claude
- 🔍 Perplexity
- 🌐 Copilot
❓ Frequently Asked Questions
💳 Do I need a paid Apify plan to run this actor?
No. You can start right now on the free Apify plan, which includes $5 in free monthly credit. That is enough to run this actor several times and explore the output before committing to anything. Paid plans unlock higher limits, more concurrent runs, and larger datasets. Create a free Apify account here to get started.
🚨 What happens if my run fails or returns no results?
Failed runs are not charged. If the source site changes, proxies get rate-limited, or a specific input matches nothing, re-run the actor or open our contact form and we will investigate. You can also check the run log in the Apify console to see why the run stopped.
📈 How many items can I scrape per run?
Free users are limited to 10 items per run so you can preview the output and confirm the actor works for your use case. Paid users can raise maxItems up to 1,000,000 per run. Upgrade here if you need full scale.
🔄 How fresh is the data?
Every run fetches live data at the moment of execution. There is no cache or delay: the records you get reflect what the source returned at that moment. Schedule the actor to maintain a rolling snapshot of the data you need.
🧑‍💻 Can I call this actor from my own code?
Yes. Apify exposes every actor as a REST endpoint and ships first-class SDKs for Node.js and Python. You can start a run, read the dataset, and handle webhooks from your own app in a few lines. All you need is your Apify API token.
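As a minimal sketch with the official `apify-client` Python package (the token and actor ID strings below are placeholders you must replace with your own values):

```python
def fetch_breweries(token, actor_id, run_input):
    """Start an Actor run on Apify and return the dataset items as a list.

    `token` is your Apify API token; `actor_id` is the Actor's ID or
    "username/actor-name" slug from its Store page.
    """
    from apify_client import ApifyClient  # pip install apify-client
    client = ApifyClient(token)
    # Start the run and wait for it to finish.
    run = client.actor(actor_id).call(run_input=run_input)
    # Read the resulting dataset.
    return list(client.dataset(run["defaultDatasetId"]).iterate_items())

# Example (requires a valid token and network access):
# items = fetch_breweries("<YOUR_APIFY_TOKEN>", "<ACTOR_ID>",
#                         {"state": "Oregon", "city": "Portland", "maxItems": 100})
```

The same pattern works with the Node.js client, and webhooks can notify your app when a run finishes instead of waiting synchronously.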
📤 How do I export the data?
Every Apify dataset can be downloaded in one click from the console as CSV, JSON, JSONL, Excel, HTML, XML, or RSS. You can also pull results programmatically via the Apify API or stream them into BigQuery, S3, and other destinations through built-in integrations.
📅 Can I schedule the actor to run automatically?
Yes. Use the Apify scheduler to run the actor on any cadence, from hourly to monthly. Results are saved to your dataset and can be delivered to webhooks, email, Slack, cloud storage, or automation tools such as Zapier and Make.
🔁 Automating Open Brewery DB Scraper
Control the scraper programmatically for scheduled runs and pipeline integrations:
- 🟢 Node.js. Install the apify-client NPM package.
- 🐍 Python. Use the apify-client PyPI package.
- 📚 See the Apify API documentation for full details.
The Apify Schedules feature lets you trigger this Actor on any cron interval. Run it monthly to track new brewery openings and closings in your target markets.
🔗 Integrate with any app
Open Brewery DB Scraper connects to any cloud service via Apify integrations:
- Make - Automate multi-step workflows
- Zapier - Connect with 5,000+ apps
- Slack - Get run notifications
- Airbyte - Pipe data into your warehouse
- GitHub - Trigger runs from commits
- Google Drive - Export datasets straight to Sheets
You can also use webhooks to trigger downstream actions when a run finishes.
🌟 Recommended Actors
- 🏫 Greatschools Scraper - Collect school ratings and reviews by location
- 🏠 REMAX Real Estate Scraper - Scrape real estate listings with addresses
- 🔍 Smart Apify Actor Scraper - Scrape Apify Store actors with 70+ fields
- 📰 PR Newswire Scraper - Collect press releases and corporate news
- 🔧 HTML to JSON Smart Parser - Parse any web page into structured JSON
💡 Pro Tip: browse the complete ParseForge collection for more data scrapers and tools.
🆘 Need Help? Open our contact form to request a new scraper, propose a custom data project, or report an issue.
⚠️ Disclaimer: this Actor is an independent tool and is not affiliated with, endorsed by, or sponsored by Open Brewery DB or its maintainers. All trademarks mentioned are the property of their respective owners. Only publicly available data is collected.