What You'll Build
An AI-powered web scraper that extracts product data from e-commerce sites and stores it in MongoDB with automatic schema validation and data analysis.
Before You Start
Ensure you have these requirements ready:

- Node.js 16+: the required runtime environment
- MongoDB: a local install or MongoDB Atlas
- API access: a Browserbase API key and project ID
Step 1: Project Setup
Clone and Install
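Assuming standard npm tooling, the setup looks like this (the repository URL and directory are placeholders; substitute the actual project's):

```bash
# Clone the example project (placeholder URL) and install its dependencies
git clone <repository-url>
cd <project-directory>
npm install
```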
What gets installed?
Core packages:

- @browserbasehq/stagehand: AI-powered web scraping
- mongodb: MongoDB driver for data storage
- zod: schema validation for type safety
- chalk & boxen: terminal styling and output formatting
- playwright: browser automation engine
Step 2: Start MongoDB
Required: MongoDB must be running on your system before proceeding. If it is installed locally, start it with mongod; otherwise, ensure your MongoDB Atlas connection is ready. Atlas users can skip this step, since the database is already hosted in the cloud.
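One way to confirm a local MongoDB instance is reachable, assuming the mongosh shell is installed:

```bash
# Ping the local MongoDB server; it reports { ok: 1 } when the server is up
mongosh --eval "db.runCommand({ ping: 1 })"
```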
Step 3: Configuration
Environment Variables
Create your .env file with the required configuration:
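A minimal .env might look like the following. The Browserbase variable names are the ones Stagehand reads from the environment; the MongoDB variable name is an assumption and should match whatever the project's code expects:

```env
# Browserbase credentials (from your Browserbase dashboard)
BROWSERBASE_API_KEY=your-browserbase-api-key
BROWSERBASE_PROJECT_ID=your-browserbase-project-id

# MongoDB connection string (variable name is illustrative)
MONGODB_URI=mongodb://localhost:27017
```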
Step 4: Configure Stagehand
The integration is configured to use Browserbase cloud browsers in stagehand.config.ts:
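A typical Browserbase-backed configuration looks roughly like this sketch (field names follow Stagehand's constructor parameters; verify them against your installed version):

```typescript
import type { ConstructorParams } from "@browserbasehq/stagehand";

// Sketch of a Browserbase-backed Stagehand configuration.
// env: "BROWSERBASE" routes sessions to cloud browsers instead of a local one.
const config: ConstructorParams = {
  env: "BROWSERBASE",
  apiKey: process.env.BROWSERBASE_API_KEY,
  projectId: process.env.BROWSERBASE_PROJECT_ID,
  verbose: 1,
};

export default config;
```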
Step 5: Run Your First Scrape
What happens when you run the scraper:

- Connects to MongoDB and creates the necessary collections
- Navigates to Amazon laptop category
- Scrapes product listings with AI-powered extraction
- Extracts detailed information for the first 3 products
- Stores all data in MongoDB with schema validation
- Runs analysis queries and displays results
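The steps above can be sketched in TypeScript. This is an illustrative outline, not the project's actual code: the database, collection, and schema names are assumptions, and the listing count and search URL are examples.

```typescript
import { Stagehand } from "@browserbasehq/stagehand";
import { MongoClient } from "mongodb";
import { z } from "zod";

// Illustrative product schema; the real project may define more fields.
const ProductSchema = z.object({
  title: z.string(),
  price: z.string(),
  rating: z.string().optional(),
});

async function run() {
  // 1. Connect to MongoDB (URI from .env, with a local fallback).
  const client = new MongoClient(process.env.MONGODB_URI ?? "mongodb://localhost:27017");
  await client.connect();
  const products = client.db("scraper").collection("products");

  // 2. Start a Browserbase cloud browser session via Stagehand.
  const stagehand = new Stagehand({ env: "BROWSERBASE" });
  await stagehand.init();
  const page = stagehand.page;

  // 3. Navigate to the category and extract with an AI instruction + zod schema.
  await page.goto("https://www.amazon.com/s?k=laptop");
  const { listings } = await page.extract({
    instruction: "extract the first 3 laptop listings with title, price, and rating",
    schema: z.object({ listings: z.array(ProductSchema) }),
  });

  // 4. Schema-validated documents go straight into MongoDB.
  await products.insertMany(listings);

  await stagehand.close();
  await client.close();
}

run().catch(console.error);
```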
Execute the Scraper
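Assuming the project defines a start script in its package.json:

```bash
npm start
```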
Customization Options
Extend Data Schema
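A hypothetical extension of the product schema; the reviewCount and inStock fields below are illustrative, not part of the project's actual schema:

```typescript
import { z } from "zod";

// Base fields plus two hypothetical custom additions.
const ProductSchema = z.object({
  title: z.string(),
  price: z.string(),
  rating: z.string().optional(),
  // Custom additions (illustrative names):
  reviewCount: z.number().optional(),
  inStock: z.boolean().optional(),
});

// The inferred TypeScript type stays in sync with the schema.
type Product = z.infer<typeof ProductSchema>;
```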
Add custom fields to the schema to capture more product information.

Custom Extraction Instructions
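One way to steer the AI is through the instruction string passed to page.extract. This sketch assumes an initialized Stagehand page is available; the helper name, schema fields, and instruction wording are all illustrative:

```typescript
import { z } from "zod";
import type { Page } from "@browserbasehq/stagehand";

// Hypothetical helper: run an extraction with a caller-supplied instruction.
async function extractWithInstruction(page: Page, instruction: string) {
  return page.extract({
    instruction,
    schema: z.object({
      listings: z.array(
        z.object({ title: z.string(), price: z.string() })
      ),
    }),
  });
}
```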
Modify the extraction instruction passed to the AI to capture site-specific data.

What's Next?
Now that you have a working MongoDB + Stagehand integration:

Scale Your Scraping
Learn how to scale your scraping operations across multiple sites and handle larger datasets.
Deploy to Production
Deploy your scraping pipeline to production with Browserbase.
Need help? Join the Stagehand Slack community for support and to share your scraping projects!