REMOTE AWS Big Data Engineer

Post Date

Oct 05, 2023

Location

Atlanta, Georgia

ZIP/Postal Code

30309
US

Job Type

Contract, Perm Possible

Category

Computer Engineering

Req #

ATL-651835

Pay Rate

$60 - $90 (hourly estimate)

Job Description

A client of Insight Global is seeking an experienced Big Data Engineer. The successful candidate must have Big Data engineering experience and demonstrate an affinity for working with others to create successful solutions. You will join a smart, highly skilled team with a passion for technology and work on state-of-the-art Big Data platforms. The candidate must be a strong communicator, both written and verbal, and have experience working with business areas to translate their data needs and data questions into project requirements. The candidate will participate in all phases of the Data Engineering life cycle and will, both independently and collaboratively, write project requirements, architect solutions, and perform data ingestion development and support duties.

Required Skills & Experience

* 6+ years of overall IT experience

* 3+ years of experience with high-velocity, high-volume stream processing using Apache Kafka and Spark Streaming

o Experience with real-time data processing and streaming techniques using Spark structured streaming and Kafka

o Deep knowledge of troubleshooting and tuning Spark applications

* 3+ years of experience with data ingestion from message queues (TIBCO, IBM, etc.) and file formats such as JSON, XML, and CSV across different platforms

* 3+ years of experience with Big Data tools/technologies such as Hadoop, Spark, Spark SQL, Kafka, Sqoop, Hive, S3, or HDFS

* 3+ years of experience building, testing, and optimizing Big Data ingestion pipelines, architectures, and data sets

* 2+ years of experience with Python (and/or Scala) and PySpark/Scala-Spark

* 3+ years of experience with Cloud platforms e.g. AWS, GCP, etc.

* 3+ years of experience with database solutions such as Kudu/Impala, Delta Lake, Snowflake, or BigQuery

* 2+ years of experience with NoSQL databases, including HBase and/or Cassandra

* Experience successfully building and deploying a new data platform on Azure/AWS

* Experience with Azure/AWS serverless technologies such as S3, Kinesis/MSK, Lambda, and Glue

* Strong knowledge of messaging platforms such as Kafka, Amazon MSK, TIBCO EMS, or IBM MQ Series

* Experience with the Databricks UI, managing Databricks notebooks, Delta Lake with Python, Delta Lake with Spark SQL, Delta Live Tables, and Unity Catalog

* Knowledge of Unix/Linux platform and shell scripting is a must

* Strong analytical and problem-solving skills

Nice to Have Skills & Experience

* Strong SQL skills with ability to write intermediate complexity queries

* Strong understanding of Relational & Dimensional modeling

* Experience with Git version control software

* Experience with REST API and Web Services

* Good business analysis and requirements gathering/writing skills

Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.