A top client based in Grand Rapids, MI is looking to bring an Enterprise GCP Big Data Architect onto the team. You will spearhead the company's greenfield GCP platform development and build out teams of Data Engineers who will report directly to you. You will be expected to hit the ground running, validating the teams' current processes and data architecture. You will be very hands-on with platform development, leading teams of contractors, and actively shaping the future of the company's data ecosystem.
Other Responsibilities Will Include:
-Leading the ideation, architecture, design, and development of the new enterprise data platform.
-Architecting and designing core components with a microservices architecture, abstracting away platform and infrastructure intricacies.
-Creating and maintaining essential data platform SDKs and libraries, adhering to industry best practices.
-Designing and developing connector frameworks and modern connectors to source data from disparate systems both on-prem and cloud.
-Designing and optimizing data storage, processing, and querying performance for large-scale datasets using industry best practices while keeping costs in check.
-Architecting and designing the best security patterns and practices.
-Designing and developing data quality frameworks and processes to ensure the accuracy and reliability of data.
-Collaborating with data scientists, analysts, and cross-functional teams to design data models, database schemas and data storage solutions.
-Designing and developing advanced analytics and machine learning capabilities on the data platform.
-Designing and developing observability and data governance frameworks and practices.
-Staying up to date with the latest data engineering trends, technologies, and best practices.
-Driving the deployment and release cycles, ensuring a robust and scalable platform.
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to HR@insightglobal.com.
To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy:
https://insightglobal.com/workforce-privacy-policy/.
Required Skills & Experience:
-Strong hands-on experience with GCP data services (Dataflow, Dataplex, Airflow/Cloud Composer, Dataproc, etc.)
-Proficient in building enterprise end-to-end data platforms and data services in GCP
-Experience migrating data to BigQuery
-Experience with microservices (Docker, Kubernetes, TypeScript & Node.js-based servers)
-Experience in Data Lakehouse architectures
-Experience writing Spark queries using Python (PySpark)
Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.