
Open Brewery DB Scraper

Pricing

from $8.00 / 1,000 results

Scrape brewery listings across the United States. Get brewery names, types (micro, brewpub, large, contract), full addresses, GPS coordinates, phone numbers, and websites for 8,000+ breweries. Filter by state, city, brewery type, and postal code. Great for lead generation and market analysis.


Rating

0.0 (0 reviews)

Developer

ParseForge


Maintained by Community

Actor stats

  • Bookmarked: 1
  • Total users: 2
  • Monthly active users: 1
  • Last modified: 5 hours ago



๐Ÿบ Open Brewery DB Scraper

๐Ÿš€ Collect structured brewery data with names, types, full addresses, phone numbers, website URLs, and GPS coordinates from the Open Brewery DB. Filter by state, city, type, and postal code.

๐Ÿ•’ Last updated: 2026-04-17

The Open Brewery DB is an open-source dataset of breweries, primarily covering the United States. This Actor queries that database and returns structured records with brewery names, types (micro, nano, regional, brewpub, large, and more), full street addresses, phone numbers, website URLs, and latitude/longitude coordinates. You can filter by state, city, brewery type, postal code, and sort order.

If you are building a brewery finder app, researching the craft beer market, planning brewery tours, or prospecting for distribution partnerships, this scraper collects the data you need in seconds. Set it on a monthly schedule to keep your brewery database current as new locations open and others close.

  • Target: Open Brewery DB
  • Use cases: Brewery finder apps, market research, tour planning, distribution prospecting

📋 What it does

  • 🏭 Brewery listings. Names, types, and unique IDs for breweries across the United States.
  • 📍 Full addresses. Street addresses, cities, states, postal codes, and countries.
  • 📞 Contact details. Phone numbers and website URLs for each brewery.
  • 🗺️ GPS coordinates. Latitude and longitude for mapping and geospatial analysis.
  • 🔍 Location and type filters. Search by state, city, postal code, and brewery type.

Each record includes the brewery ID, name, type, street address, city, state, postal code, country, phone number, website URL, latitude, longitude, and scrape timestamp.

💡 Why it matters: Manually looking up brewery details one by one takes ages. This Actor collects hundreds of records with consistent formatting, complete with GPS coordinates for mapping, in a single run.


🎬 Full Demo

🚧 Coming soon: a 3-minute walkthrough showing how to go from sign-up to a downloaded dataset.


โš™๏ธ Input

| Input | Type | Default | Behavior |
|---|---|---|---|
| state | string | "California" | Filter breweries by US state (e.g., "California", "New York", "Texas"). |
| city | string | - | Optional city filter within the selected state. |
| postalCode | string | - | Filter by ZIP code (e.g., "44107" or "44107-9863"). |
| breweryType | string | - | Filter by type: micro, nano, regional, brewpub, large, planning, bar, contract, proprietor, or closed. |
| sort | string | - | Sort by: name, city, state, or type. |
| maxItems | integer | 10 | Maximum number of breweries to return. Free users are limited to 10; paid users can set up to 1,000,000. |

Example: Microbreweries in California sorted by name.

```json
{
  "state": "California",
  "breweryType": "micro",
  "sort": "name",
  "maxItems": 50
}
```

Example: All breweries in Portland.

```json
{
  "state": "Oregon",
  "city": "Portland",
  "maxItems": 100
}
```

โš ๏ธ Good to Know: The Open Brewery DB is community-maintained and focuses primarily on US breweries. Some entries may have missing phone numbers or GPS coordinates. Schedule periodic runs to catch new additions.


📊 Output

Each record contains 15+ fields. Download as CSV, Excel, JSON, or XML.

🧾 Schema

| Field | Type | Example |
|---|---|---|
| 📋 name | string | "Sierra Nevada Brewing Co" |
| 🏷️ breweryType | string | "regional" |
| 📍 address1 | string | "1075 E 20th St" |
| 🏙️ city | string | "Chico" |
| 🗺️ stateProvince | string | "California" |
| 📮 postalCode | string | "95928-6722" |
| 🌎 country | string | "United States" |
| 📞 phone | string | "5308932520" |
| 🔗 websiteUrl | string | "http://www.sierranevada.com" |
| 📐 latitude | string | "39.7242" |
| 📐 longitude | string | "-121.7867" |
| 🆔 id | string | "sierra-nevada-brewing-co-chico" |

📦 Sample records
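An illustrative record in the shape of the schema above (field values mirror the schema's example column, not live output):

```json
{
  "id": "sierra-nevada-brewing-co-chico",
  "name": "Sierra Nevada Brewing Co",
  "breweryType": "regional",
  "address1": "1075 E 20th St",
  "city": "Chico",
  "stateProvince": "California",
  "postalCode": "95928-6722",
  "country": "United States",
  "phone": "5308932520",
  "websiteUrl": "http://www.sierranevada.com",
  "latitude": "39.7242",
  "longitude": "-121.7867"
}
```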


✨ Why choose this Actor

  • 🏭 10 brewery types. Micro, nano, regional, brewpub, large, planning, bar, contract, proprietor, and closed.
  • 🗺️ GPS coordinates. Latitude and longitude for mapping and geospatial work.
  • 📞 Contact details. Phone numbers and website URLs included.
  • 📍 Location filtering. Search by state, city, or postal code.
  • 📊 Sortable results. Order by name, city, state, or type.
  • 📅 Schedule-ready. Set monthly runs to track new brewery openings.
  • 📊 Multiple exports. Download as CSV, Excel, JSON, or XML.

The United States has over 9,000 craft breweries, and that number changes every month as new ones open and others close. Keeping a directory current requires regular data refreshes.


📈 How it compares to alternatives

| Approach | Cost | Coverage | Refresh | Setup |
|---|---|---|---|---|
| ⭐ Open Brewery DB Scraper (this Actor) | $5 free credit, then pay-per-use | US breweries + some international | Live per run | ⚡ 2 min |
| Manual web searches | Free | Limited by time | Manual checks | 🕐 Hours |
| Open Brewery DB API | Free | Same dataset | Per request | 🔧 1-2 hours |
| Third-party brewery databases | $50-200/mo | Varies | Monthly | 📋 30 min |

Pick this Actor when you want structured brewery data with GPS coordinates without writing API code or managing pagination yourself.
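For comparison, calling the public Open Brewery DB API yourself looks roughly like the sketch below. The endpoint and parameter names (`by_state`, `by_type`, `per_page`) are taken from the public API's documented conventions; verify them against openbrewerydb.org before relying on this, and note you would still handle pagination, retries, and export formatting on your own.

```python
# Sketch of a direct Open Brewery DB API call (assumed v1 endpoint and params).
BASE_URL = "https://api.openbrewerydb.org/v1/breweries"

def build_params(state, brewery_type, per_page=50):
    """Assemble query params; the API expects lowercase, underscore-separated state names."""
    return {
        "by_state": state.lower().replace(" ", "_"),
        "by_type": brewery_type,
        "per_page": per_page,
    }

if __name__ == "__main__":
    import requests  # pip install requests

    resp = requests.get(BASE_URL, params=build_params("California", "micro"), timeout=30)
    resp.raise_for_status()
    for brewery in resp.json():
        print(brewery["name"], brewery.get("city"))
```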


🚀 How to use

  1. 📝 Sign up. Create a free account with $5 credit (takes 2 minutes).
  2. 🌐 Open the Actor. Go to the Open Brewery DB Scraper page on the Apify Store.
  3. 🎯 Set input. Choose a state, city, brewery type, and set your max items.
  4. 🚀 Run it. Click Start and let the Actor collect your data.
  5. 📥 Download. Grab your results in the Dataset tab as CSV, Excel, JSON, or XML.

⏱️ Total time from signup to downloaded dataset: 3-5 minutes. No coding required.


💼 Business use cases

🗺️ App Development

  • Build brewery finder apps with GPS coordinates
  • Create interactive maps of craft breweries by region
  • Populate location databases for beer tourism apps
  • Add brewery contact data to business directories

📊 Market Research

  • Analyze craft beer industry density by state and city
  • Track brewery types (micro vs. regional vs. brewpub)
  • Study geographic clustering of craft breweries
  • Compare brewery counts across metropolitan areas

๐Ÿš Tourism and Events

  • Plan brewery tour routes with addresses and coordinates
  • Build curated brewery guides for specific cities
  • Create event listings around brewery districts
  • Generate tour packages with contact details

🛒 Distribution and Sales

  • Identify breweries by region and type for prospecting
  • Build sales territories based on brewery density
  • Track new brewery openings for outreach
  • Segment by type for targeted marketing


🌟 Beyond business use cases

Data like this powers more than commercial workflows. The same structured records support research, education, civic projects, and personal initiatives.

🎓 Research and academia

  • Empirical datasets for papers, thesis work, and coursework
  • Longitudinal studies tracking changes across snapshots
  • Reproducible research with cited, versioned data pulls
  • Classroom exercises on data analysis and ethical scraping

🎨 Personal and creative

  • Side projects, portfolio demos, and indie app launches
  • Data visualizations, dashboards, and infographics
  • Content research for bloggers, YouTubers, and podcasters
  • Hobbyist collections and personal trackers

๐Ÿค Non-profit and civic

  • Transparency reporting and accountability projects
  • Advocacy campaigns backed by public-interest data
  • Community-run databases for local issues
  • Investigative journalism on public records

🧪 Experimentation

  • Prototype AI and machine-learning pipelines with real data
  • Validate product-market hypotheses before engineering spend
  • Train small domain-specific models on niche corpora
  • Test dashboard concepts with live input


โ“ Frequently Asked Questions

💳 Do I need a paid Apify plan to run this actor?

No. You can start right now on the free Apify plan, which includes $5 in free monthly credit. That is enough to run this actor several times and explore the output before committing to anything. Paid plans unlock higher limits, more concurrent runs, and larger datasets. Create a free Apify account here to get started.

🚨 What happens if my run fails or returns no results?

Failed runs are not charged. If the source site changes, proxies get rate-limited, or a specific input matches nothing, re-run the actor or open our contact form and we will investigate. You can also check the run log in the Apify console to see why the run stopped.

๐Ÿ“ How many items can I scrape per run?

Free users are limited to 10 items per run so you can preview the output and confirm the actor works for your use case. Paid users can raise maxItems up to 1,000,000 per run. Upgrade here if you need full scale.

🕒 How fresh is the data?

Every run fetches live data at the moment of execution. There is no cache or delay: the records you get reflect what the source returned at that moment. Schedule the actor to maintain a rolling snapshot of the data you need.

๐Ÿง‘โ€๐Ÿ’ป Can I call this actor from my own code?

Yes. Apify exposes every actor as a REST endpoint and ships first-class SDKs for Node.js and Python. You can start a run, read the dataset, and handle webhooks from your own app in a few lines. All you need is your Apify API token.
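For example, a minimal sketch using the official apify-client package for Python. The Actor ID string below is a placeholder; copy the real ID from the Actor page, and supply your own API token:

```python
# pip install apify-client
# Minimal sketch: start a run of the scraper and read its dataset.

def build_run_input(state, brewery_type=None, max_items=50):
    """Assemble the Actor input described in the Input table above."""
    run_input = {"state": state, "maxItems": max_items}
    if brewery_type:
        run_input["breweryType"] = brewery_type
    return run_input

if __name__ == "__main__":
    from apify_client import ApifyClient

    client = ApifyClient("<YOUR_APIFY_TOKEN>")
    # NOTE: placeholder Actor ID; use the ID shown on the Actor page.
    run = client.actor("parseforge/open-brewery-db-scraper").call(
        run_input=build_run_input("California", "micro", 50)
    )
    for item in client.dataset(run["defaultDatasetId"]).iterate_items():
        print(item["name"], item.get("city"))
```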

📤 How do I export the data?

Every Apify dataset can be downloaded in one click from the console as CSV, JSON, JSONL, Excel, HTML, XML, or RSS. You can also pull results programmatically via the Apify API or stream them into BigQuery, S3, and other destinations through built-in integrations.
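For programmatic pulls, the Apify dataset items endpoint accepts a format query parameter. A minimal sketch of building the export URL (the dataset ID is a placeholder; real IDs appear in the run details):

```python
# Build an Apify dataset export URL for a given format.
API_BASE = "https://api.apify.com/v2"

def export_url(dataset_id, fmt="csv"):
    """Return the download URL for a dataset in the requested format."""
    # Formats listed in the FAQ answer above.
    allowed = {"json", "jsonl", "csv", "xlsx", "html", "xml", "rss"}
    if fmt not in allowed:
        raise ValueError(f"unsupported format: {fmt}")
    return f"{API_BASE}/datasets/{dataset_id}/items?format={fmt}"
```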

📅 Can I schedule the actor to run automatically?

Yes. Use the Apify scheduler to run the actor on any cadence, from hourly to monthly. Results are saved to your dataset and can be delivered to webhooks, email, Slack, cloud storage, or automation tools such as Zapier and Make.


🔌 Automating Open Brewery DB Scraper

Control the scraper programmatically for scheduled runs and pipeline integrations:

  • 🟢 Node.js. Install the apify-client NPM package.
  • 🐍 Python. Use the apify-client PyPI package.
  • 📚 See the Apify API documentation for full details.

The Apify Schedules feature lets you trigger this Actor on any cron interval. Run it monthly to track new brewery openings and closings in your target markets.

🔌 Integrate with any app

Open Brewery DB Scraper connects to any cloud service via Apify integrations:

  • Make - Automate multi-step workflows
  • Zapier - Connect with 5,000+ apps
  • Slack - Get run notifications
  • Airbyte - Pipe data into your warehouse
  • GitHub - Trigger runs from commits
  • Google Drive - Export datasets straight to Sheets

You can also use webhooks to trigger downstream actions when a run finishes.


💡 Pro Tip: browse the complete ParseForge collection for more data scrapers and tools.


🆘 Need Help? Open our contact form to request a new scraper, propose a custom data project, or report an issue.


โš ๏ธ Disclaimer: this Actor is an independent tool and is not affiliated with, endorsed by, or sponsored by Open Brewery DB or its maintainers. All trademarks mentioned are the property of their respective owners. Only publicly available data is collected.