Data Engineer

Post Date

Jan 07, 2026

Location

New York, New York

ZIP/Postal Code

10019
US
Jun 02, 2026 Insight Global

Job Type

Contract

Category

Architect (Engineering)

Req #

DGO-6b38c7e1-da38-4fb8-8511-9427a61810e0

Pay Rate

$51 - $64 (hourly estimate)

Job Description

Insight Global is seeking a fully remote, hands‑on Data Engineer to work East Coast hours for our client in New York City. This person will be responsible for building and owning the data foundation behind revenue, pricing, promotions, and inventory decisions for a ticketing program. You’ll design the data warehouse and pipelines that convert messy, scheduled reports and HTML/CSV files into clean, reliable datasets that power pricing strategy, sales analytics, and inventory visibility across shows and venues, using analytics to turn raw sales signals into automated, trustworthy tables consumed by Tableau dashboards and decision models.
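To give a flavor of the report-to-table work described above, here is a minimal sketch, using only the Python standard library, of turning a scheduled CSV sales report into typed, analysis-ready rows. The file layout, column names, and formats are hypothetical, not taken from the client's actual reports.

```python
import csv
import io
from datetime import datetime

# Hypothetical raw report: the columns and formats below are illustrative only.
RAW_CSV = """show,performance_date,gross_sales,tickets_sold
Hamlet,01/07/2026,"$12,340.50",310
Hamlet,01/08/2026,"$9,876.00",255
"""

def clean_row(row: dict) -> dict:
    """Normalize one raw report row into typed, analysis-ready fields."""
    return {
        "show": row["show"].strip(),
        # Convert US-style dates to ISO for consistent warehouse partitioning.
        "performance_date": datetime.strptime(
            row["performance_date"], "%m/%d/%Y"
        ).date().isoformat(),
        # Strip currency formatting so the value loads as a numeric column.
        "gross_sales": float(row["gross_sales"].replace("$", "").replace(",", "")),
        "tickets_sold": int(row["tickets_sold"]),
    }

def clean_report(raw_text: str) -> list[dict]:
    """Parse a raw CSV report into a list of cleaned row dicts."""
    return [clean_row(r) for r in csv.DictReader(io.StringIO(raw_text))]

rows = clean_report(RAW_CSV)
```

In production this kind of transform would land in a curated warehouse table rather than an in-memory list, but the normalization steps are the same.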

Responsibilities:
• Architect & implement the data platform: Stand up cloud data warehousing (preferably Google BigQuery) and define storage, partitioning, and modeling standards for sales, promotions, and inventory tables (e.g., star/snowflake schemas).
• Build ingestion & transformation pipelines: Create robust, scheduled ETL/ELT jobs that ingest data from Google Drive/CSV/HTML and other sources; normalize and enrich datasets; and publish curated marts for analytics and pricing.
• Automate manual processes: Replace ad‑hoc, manual pulls with reliable, monitored pipelines; implement job orchestration, alerting, and data‑quality checks (e.g., freshness, completeness, referential integrity).
• Enable pricing & promo strategy: Provide fast, accurate tables that support dynamic pricing, discounting, and campaign outcomes; surface inventory positions by show/date/section to guide strategy.
• Partner with analytics & business users: Collaborate with revenue leaders and analysts using Tableau/Excel to define SLAs, data contracts, and semantic layers; deliver well‑documented datasets that are easy to consume.
• Productionize & operate: Own deployment, monitoring, and incident response for pipelines; optimize SQL and storage costs in BigQuery; continuously improve performance and reliability.
• Security & governance: Implement access controls, data lineage, and audit trails; establish naming conventions and versioning for transformations.
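The freshness and completeness checks mentioned in the responsibilities above can be sketched in plain Python; the table rows, field names, and thresholds here are hypothetical, and a real pipeline would wire these checks into the orchestrator's alerting.

```python
from datetime import date, timedelta

def check_freshness(latest_load_date: date, max_age_days: int = 1) -> bool:
    """Pass only if the most recent load is within the allowed age window."""
    return (date.today() - latest_load_date) <= timedelta(days=max_age_days)

def check_completeness(rows: list[dict], required: tuple[str, ...]) -> list[int]:
    """Return indices of rows missing any required field (empty list = pass)."""
    return [
        i for i, r in enumerate(rows)
        if any(r.get(col) in (None, "") for col in required)
    ]

# Illustrative inventory rows; the blank section on row 1 should be flagged.
inventory_rows = [
    {"show": "Hamlet", "section": "Orchestra", "inventory": 120},
    {"show": "Hamlet", "section": "", "inventory": 80},
]
failing = check_completeness(inventory_rows, ("show", "section", "inventory"))
fresh = check_freshness(date.today())
```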


Compensation:
$60/hr to $70/hr.
Exact compensation may vary based on several factors, including skills, experience, and education.
Employees in this role will enjoy a comprehensive benefits package starting on day one of employment, including options for medical, dental, and vision insurance. Eligibility to enroll in the 401(k) retirement plan begins after 90 days of employment. Additionally, employees in this role will have access to paid sick leave and other paid time off benefits as required under the applicable law of the worksite location.

We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to HR@insightglobal.com. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.

Required Skills & Experience

• 5+ years in data engineering building production pipelines and warehouses at scale.
• Advanced SQL (window functions, CTEs, performance tuning) and practical Python (ETL/ELT, parsing HTML/CSV, API/file handling, testing).
• Proven experience with Google BigQuery (or equivalent columnar cloud warehouse) including partitioning, clustering, and cost/performance optimization.
• Experience ingesting non‑API data sources (scheduled reports, HTML/CSV files) and turning them into clean, reliable tables.
• Strong understanding of data modeling (star/snowflake), data quality (validation, reconciliation), and orchestration (e.g., Airflow/Cloud Composer or similar).
• Ability to translate pricing/inventory business needs into scalable dataset designs; excellent documentation and stakeholder communication.
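As one example of the non-API ingestion skill listed above (turning HTML reports into tables), here is a minimal sketch using the standard-library `html.parser`; the report markup is hypothetical.

```python
from html.parser import HTMLParser

class TableExtractor(HTMLParser):
    """Collect the cell text of each <tr> in an HTML table into a row list."""

    def __init__(self):
        super().__init__()
        self.rows = []
        self._row = []
        self._in_cell = False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
        elif tag in ("td", "th"):
            self._in_cell = False

    def handle_data(self, data):
        # Only keep text that appears inside a table cell.
        if self._in_cell and data.strip():
            self._row.append(data.strip())

# Hypothetical scheduled HTML report, reduced to a single small table.
HTML_REPORT = (
    "<table><tr><th>show</th><th>sold</th></tr>"
    "<tr><td>Hamlet</td><td>310</td></tr></table>"
)
parser = TableExtractor()
parser.feed(HTML_REPORT)
```

From here the extracted rows would flow into the same cleaning and loading steps used for CSV sources.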

Benefit packages for this role will start on the 1st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.