Outbound Feed

Push job data directly to your infrastructure.

A continuous pipeline that syncs job listings to your databases and search engines every 15 minutes—with incremental updates, deduplication, and automatic retries.

Pipeline Overview
Sources: Greenhouse, Lever, Workday, Ashby, iCIMS, BambooHR, and 39 more
Jobo Feed Engine: sync every 15 min · upsert · retry
Your infrastructure: PostgreSQL, MongoDB, Elasticsearch, Algolia, Meilisearch
45+ ATS sources · 5 destinations · 10K records per sync

Three steps to continuous sync.

Choose a destination, configure connection credentials and data filters, activate the feed—and Jobo handles scheduling, retries, and deduplication for you.

01

Choose a destination.

Pick where you want job data delivered—PostgreSQL, MongoDB, Elasticsearch, Algolia, or Meilisearch.

Databases, search engines, and more
02

Configure & save.

Enter connection credentials and optional data filters. Your configuration is saved and can be edited anytime.

Filter by location, source, remote, date
03

Activate & sync.

Activate the feed to start pushing data. Jobo handles scheduling, retries, and deduplication—every 15 minutes, automatically.

Incremental updates, zero maintenance

Five destinations. Zero maintenance.

Push to databases and search engines used by thousands of engineering teams. Credentials are encrypted with AES-256 at rest—never stored in plain text.

PostgreSQL

Database

Sync jobs into structured tables with full SQL query support. Supports SSL connections and automatic table creation.

host · port · database · schema · username · password · ssl_mode

MongoDB

Database

Push job listings as rich JSON documents. Supports MongoDB Atlas and self-hosted instances.

connection_string · database · auth_source
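
As a rough sketch, a MongoDB configuration built from the fields above could look like the following; the connection string, database name, and auth_source value are placeholders, not real credentials.

{
  "connection_string": "mongodb+srv://jobo_sync:••••••••@cluster0.example.net",
  "database": "jobs_db",
  "auth_source": "admin"
}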

Elasticsearch

Search Engine

Index job listings for full-text search and analytics. Supports basic auth, API key, or unauthenticated access.

endpoint · auth_method · username · password · api_key_id · api_key_secret
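
A hedged sketch of an Elasticsearch configuration using API key auth; the endpoint and key values are placeholders, and the exact auth_method values accepted by the dashboard may differ.

{
  "endpoint": "https://es.example.com:9200",
  "auth_method": "api_key",
  "api_key_id": "jobo-feed-key",
  "api_key_secret": "••••••••"
}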

Algolia

Search Engine
Popular

Instant search with typo tolerance, faceting, and geo-search. Auto-creates the index.

application_id · api_key · index_name
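
An Algolia configuration might look like the sketch below; the application ID and index name are placeholder values, and the api_key should be a key with write access to that index.

{
  "application_id": "A1B2C3D4E5",
  "api_key": "••••••••",
  "index_name": "jobs"
}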

Meilisearch

Search Engine
Popular

Lightning-fast open-source search. Works with cloud-hosted and self-hosted instances.

host · api_key · index_name
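
A Meilisearch configuration could look like this sketch; the host URL and index name are placeholders and apply equally to cloud-hosted and self-hosted instances.

{
  "host": "https://search.example.com",
  "api_key": "••••••••",
  "index_name": "jobs"
}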

Don't see your destination? Request one

Configuration

Connect in minutes.

Each destination is configured through the dashboard with validated fields. Credentials are encrypted before storage—never visible in logs.

PostgreSQL fields

host (string, required)

Database server hostname or IP address.

port (number, required)

Connection port. Defaults to 5432.

database (string, required)

Target database name.

schema (string)

Target schema. Defaults to "public".

username (string, required)

Database user with write access.

password (string, required)

User password. Encrypted at rest with AES-256.

ssl_mode (select)

SSL mode: disable, allow, prefer, require, verify-ca, verify-full.

PostgreSQL config
{
  "host": "db.example.com",
  "port": 5432,
  "database": "jobs_db",
  "schema": "public",
  "username": "jobo_sync",
  "password": "••••••••",
  "ssl_mode": "require"
}
Sync Engine

Built for reliability.

The sync engine runs on a 15-minute interval with incremental updates, automatic retries, and deduplication—so your data stays fresh without intervention.

Incremental sync

Each cycle processes only new and updated records. A full re-sync happens only when you activate a feed or switch destinations.

Deduplication

Jobs are deduplicated by unique ID. Existing records are updated in place (upsert)—no duplicates, no stale data.

Automatic retries

Failed syncs retry with exponential backoff—1 min, 5 min, 15 min, 1 hour. You're notified of persistent failures.

Full backfill

Activating a feed or switching destinations triggers a full backfill, which may take several minutes depending on data volume.

Expiration handling

Expired jobs aren't removed automatically. Use the Expired Jobs API to detect stale records and clean up.

Schema mapping

Every destination gets the same 22-field schema. Records map automatically to tables, documents, or search indices.

Data Filters

Push only the data you need.

Each configuration includes optional filters so only relevant jobs are synced. Narrow by location, ATS source, remote status, or posting date.

Filter parameters

locations (object[])

Filter by location. Each object supports country, region, and city. Multiple locations use OR matching.

sources (string[])

Filter by ATS source (e.g., "greenhouse", "lever"). All sources included by default.

isRemote (boolean)

When enabled, only remote positions are synced.

postedAfter (date)

Only sync jobs posted after this date (ISO 8601).

Example filters
{
  "locations": [
    { "country": "US", "region": "California" },
    { "country": "GB", "city": "London" }
  ],
  "sources": ["greenhouse", "lever", "workday"],
  "isRemote": true,
  "postedAfter": "2025-01-01"
}
Data Schema

22 fields per record.

Every job record pushed to your destination follows a consistent schema—structured for search, analytics, and integration with your existing systems.

Identity

id (string)
title (string)
company_name (string)
company_id (string)

Content

description (string)
listing_url (string)
apply_url (string)

Location

city (string)
state (string)
country (string)
is_remote (boolean)

Compensation

salary_min (number)
salary_max (number)
salary_currency (string)
salary_period (string)

Classification

employment_type (string)
workplace_type (string)
experience_level (string)
source (string)

Timestamps

date_posted (datetime)
created_at (datetime)
updated_at (datetime)
Sample record
{
  "id": "abc123-def456",
  "title": "Senior Frontend Engineer",
  "company_name": "Acme Corp",
  "company_id": "acme-corp",
  "description": "We're looking for a Senior Frontend Engineer...",
  "listing_url": "https://boards.greenhouse.io/acme/jobs/123",
  "apply_url": "https://boards.greenhouse.io/acme/jobs/123#app",
  "city": "San Francisco",
  "state": "California",
  "country": "US",
  "salary_min": 150000,
  "salary_max": 200000,
  "salary_currency": "USD",
  "salary_period": "Year",
  "employment_type": "Full-time",
  "workplace_type": "Hybrid",
  "experience_level": "Senior",
  "is_remote": false,
  "source": "greenhouse",
  "date_posted": "2025-02-10T08:00:00Z",
  "created_at": "2025-02-10T09:15:00Z",
  "updated_at": "2025-02-12T14:30:00Z"
}
Security

Enterprise-grade protection.

Credentials are encrypted before storage and data transfers use TLS 1.3. Your connection details never appear in logs or error reports.

AES-256 at rest

All credentials are encrypted with AES-256 before storage. Never stored in plain text or included in logs.

TLS 1.3 in transit

All data transfers between Jobo and your destination use TLS 1.3 with the strongest available cipher suites.

Minimal permissions

We recommend a dedicated user with write-only access to the target table, index, or collection.

Limits

Quotas & rate limits.

Designed for high-volume production use with generous defaults and automatic retry handling.

Active feeds per account: 1
Saved configurations: Unlimited
Sync frequency: Every 15 minutes
Max batch size: 10,000 records per sync
Retry attempts: 4 (exponential backoff)
Supported destinations: 5 (more coming)

45+ ATS sources.

Filter your outbound feed by any combination of applicant tracking systems. All sources are included by default.

ADP
ApplicantPro
Ashby
BambooHR
Breezy
CareerPlug
CareerPuck
Comeet
Cornerstone
Dayforce
Eightfold
Freshteam
Gem
GoHire
Greenhouse
HireHive
HiringThing
Homerun
iCIMS
iSolved
JazzHR
JobScore
Jobvite
JOIN
Kula
Lever
Oracle Cloud
Paycom
Personio
Pinpoint
Polymer
Recooty
Recruitee
Rippling
SmartRecruiters
SuccessFactors
Taleo
TalNet
Teamtailor
Trakstar
UltiPro
Workable
Workday
Zoho

Frequently asked.

Common questions about the outbound feed, destination configuration, and sync behavior.

Can I have multiple active feeds?

Currently, only one feed can be active at a time. You can save multiple configurations and switch between them instantly. Switching triggers a full backfill to the new destination.

What happens when I switch destinations?

The previous feed is deactivated and the new one is activated. The new destination receives a full backfill on first sync. Data in the previous destination is not deleted.

Are expired jobs removed from my destination?

No. The outbound feed only pushes new and updated jobs. To handle expirations, use the Expired Jobs API endpoint to get recently expired job IDs and remove them from your destination.

How do I test my connection before activating?

When you save a configuration, the connection is validated automatically. If it fails, you'll see an error with details. You can save without activating to test later.

What's the difference between Outbound Feed and Export?

Export is a one-time download in CSV, JSON, or Parquet. Outbound Feed is a continuous pipeline that keeps your destination in sync every 15 minutes. Use Export for ad-hoc analysis; use Outbound Feed for production integrations.

Do I need a subscription?

Activating a feed requires an active Jobs Data subscription. You can configure, save, and test destinations without one—activation is the only step that requires a plan.

Ready to pipe your job data?

Configure your first destination in minutes. No code, no cron jobs, no maintenance.