Software Development Engineer (Java /Big Data / AI)

Post Date

Sep 10, 2025

Location

Dunwoody, Georgia

ZIP/Postal Code

30346
US

Job Type

Contract

Category

Software Engineering

Req #

ATL-6615d4db-31bb-428a-b198-431b55198501

Pay Rate

$46 - $57 (hourly estimate)

Job Description

One of our retail clients is looking for a highly skilled and motivated Software Development Engineer II to join their AI-focused engineering team. This role is ideal for a Java developer with strong Big Data experience and a foundational understanding of data engineering principles. You will be instrumental in designing and building scalable, high-performance solutions that power the client's AI-driven platforms.

Pay Rate: $55-$57/hr (negotiable based on experience)

We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to HR@insightglobal.com. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.

Required Skills & Experience

Bachelor’s degree in Computer Science, Engineering, or a related field.
6+ years of hands-on experience in Java development.
Proven experience working with Big Data technologies (e.g., Spark, Hive, Hadoop, Kafka).
Familiarity with data engineering concepts such as ETL, data lakes, and cloud-based storage.
Experience with cloud platforms (AWS, Azure, or GCP) and containerization (Docker, Kubernetes).
Exposure to AI/ML frameworks or projects (e.g., OpenAI, LangChain, TensorFlow, PyTorch).
Experience with CI/CD tools like Jenkins, Maven, or Gradle.
Knowledge of NoSQL databases and data caching strategies.
Familiarity with API gateways and secure integration practices.

Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.