Axios is a promise-based HTTP client for Node.js and the browser. Integrating 2extract.com proxies is simple: pass a proxy configuration object with your request.

Basic Setup

First, install Axios in your project: npm install axios
To route your requests through our gateway, provide the host, port, and auth details in the proxy object.
basic_setup.js
const axios = require('axios');

// 1. Get these from your proxy's "Connection Details" page
const proxyHost = "proxy.2extract.net";
const proxyPort = 5555;
const proxyUser = "PROXY_USERNAME";
const proxyPass = "PROXY_PASSWORD";

// 2. Configure the Axios request
const axiosConfig = {
  // 3. Set up the proxy object
  proxy: {
    protocol: 'http',
    host: proxyHost,
    port: proxyPort,
    auth: {
      username: proxyUser,
      password: proxyPass
    }
  },
  timeout: 10000 // 10 seconds
};

// 4. Make your request!
async function checkIp() {
  try {
    const response = await axios.get("https://api.ipify.org?format=json", axiosConfig);
    console.log("Success! Your proxy IP is:", response.data.ip);
  } catch (error) {
    if (error.response) {
      console.error(`Error making request: Status ${error.response.status}`);
    } else {
      console.error(`Error making request: ${error.message}`);
    }
  }
}

checkIp();
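
If you plan to send many requests through the gateway, you can bake the configuration into a dedicated instance with axios.create() so you don't repeat it on every call. Below is a minimal sketch, assuming the same host, port, and credentials as basic_setup.js above.
reusable_client.js
const axios = require('axios');

// Reusable instance: the proxy and timeout become defaults for every
// request made through it (values mirror basic_setup.js above).
const proxiedClient = axios.create({
  proxy: {
    protocol: 'http',
    host: 'proxy.2extract.net',
    port: 5555,
    auth: {
      username: 'PROXY_USERNAME',
      password: 'PROXY_PASSWORD'
    }
  },
  timeout: 10000 // 10 seconds
});

// Any request made on this instance is routed through the gateway.
proxiedClient.get('https://api.ipify.org?format=json')
  .then(response => console.log('Proxy IP:', response.data.ip))
  .catch(error => console.error('Request failed:', error.message));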

Real-World Example: Scraping the Reddit API for Subreddit Data

A common task is to collect structured data from public APIs. Let’s use Axios to scrape data from Reddit’s JSON API for a highly relevant subreddit: /r/webscraping. Reddit, like many sites, may serve different content or ads based on geography. We’ll check the “hot” posts from both the US and Great Britain (GB) to demonstrate this capability. This example shows how to create a reusable function to dynamically change the proxy’s country for each request.
reddit_scraper.js
const axios = require('axios');

// --- Your Base Credentials ---
const BASE_USERNAME = "PROXY_USERNAME";
const PASSWORD = "PROXY_PASSWORD";
const PROXY_HOST = "proxy.2extract.net";
const PROXY_PORT = 5555;

// --- Target Information ---
const SUBREDDIT = "webscraping";
const TARGET_URL = `https://www.reddit.com/r/${SUBREDDIT}/hot.json`;
const REGIONS = ["us", "gb"]; // USA and Great Britain

// --- Reusable Scraper Function ---
async function scrapeSubreddit(region) {
  console.log(`--- Scraping /r/${SUBREDDIT} from ${region.toUpperCase()} ---`);

  // Dynamically construct the username for the target region
  const proxyUsername = `${BASE_USERNAME}-country-${region}`;

  const axiosConfig = {
    proxy: {
      protocol: 'http',
      host: PROXY_HOST,
      port: PROXY_PORT,
      auth: {
        username: proxyUsername,
        password: PASSWORD
      }
    },
    // Reddit API requires a custom User-Agent
    headers: {
      'User-Agent': '2extract.com Docs Scraper/1.0'
    },
    timeout: 15000
  };

  try {
    const response = await axios.get(TARGET_URL, axiosConfig);

    // The actual posts are in response.data.data.children
    const posts = response.data.data.children;
    console.log(`Success! Found ${posts.length} posts.`);

    // Print the title of the first post
    if (posts.length > 0) {
      console.log(`Top post title: "${posts[0].data.title}"`);
    }

  } catch (error) {
    if (error.response) {
      // Handle errors from the target or the proxy
      console.error(`Request failed with status ${error.response.status}.`);
      // The X-2extract-Error header will be in error.response.headers
      if (error.response.headers['x-2extract-error']) {
        console.error(`--> Gateway Error: ${error.response.headers['x-2extract-error']}`);
      }
    } else {
      console.error(`An unexpected error occurred: ${error.message}`);
    }
  } finally {
    // Note: "-" * n is Python, not JavaScript; use String.prototype.repeat()
    console.log("-".repeat(30 + SUBREDDIT.length + region.length));
  }
}

// --- Main Execution Logic ---
async function run() {
  for (const region of REGIONS) {
    await scrapeSubreddit(region);
    // Optional: add a small delay between requests to different regions
    await new Promise(resolve => setTimeout(resolve, 2000));
  }
}

run();
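
The run() function deliberately walks the regions one at a time with a short delay, which keeps load on both the gateway and Reddit low. If throughput matters more, a parallel variant could replace run(); the Expected Output below assumes the sequential version.

// Parallel variant of run(): fires all regional requests at once.
// Dropping the per-request delay makes rate limiting by the target more likely.
async function runParallel() {
  await Promise.all(REGIONS.map(region => scrapeSubreddit(region)));
}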

Expected Output

Running this script produces output similar to this:
Terminal
--- Scraping /r/webscraping from US ---
Success! Found 25 posts.
Top post title: "Looking for advice on scraping dynamic JS-heavy sites"
-----------------------------------------
--- Scraping /r/webscraping from GB ---
Success! Found 25 posts.
Top post title: "Looking for advice on scraping dynamic JS-heavy sites"
-----------------------------------------
(Note: Reddit’s content may not vary significantly between the US and GB, but this demonstrates the technical capability.)

This example shows how to build a flexible scraper with Axios and 2extract.com, letting you reuse your scraping logic while dynamically changing proxy parameters for each targeted request.
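
To push the reuse one step further, the per-region configuration itself can be factored into a helper. The sketch below is hypothetical: it assumes the -country-<code> username convention and the constants from reddit_scraper.js; check your dashboard for the parameters your plan actually supports.

// Hypothetical helper: builds a per-region Axios config using the
// `-country-<code>` username convention from reddit_scraper.js.
function buildProxyConfig(region) {
  return {
    proxy: {
      protocol: 'http',
      host: PROXY_HOST,
      port: PROXY_PORT,
      auth: {
        username: `${BASE_USERNAME}-country-${region}`,
        password: PASSWORD
      }
    },
    timeout: 15000
  };
}

// Usage: await axios.get(TARGET_URL, buildProxyConfig('us'));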