Find Your Perfect Job

Job Search Results for big data

15 Results for big data

Nov 15, 2023 | Alpharetta, GA | Data Warehousing | Contract

Big Data Engineer/Scientist
$49 - $74 (hourly estimate)

Requirements:
- Bachelor's or Master's degree in computer science, mathematics, statistics, or an equivalent technical discipline
- 5+ years of data engineering experience working with big data analytics and numerous large data sets/data warehouses
- Ability to assemble, analyze, and evaluate big data and make appropriate, well-reasoned recommendations to stakeholders
- Good analytical and problem-solving skills; good understanding of different data structures, algorithms, and their use in solving business problems
- Experience with search algorithms and/or preparing data for search applications
- Ability to collaborate with and lead internal and external technology resources in solving complex business needs
- Strong communication (verbal and written), customer service, interpersonal, and presentation skills applicable to a wide audience, including senior and executive management and customers
- Experience with big data technologies
- Knowledge of data science and NLP; understanding of precision and recall in data science
- Experience with data mining; familiarity with clustering and linking of data
- Exposure to text mining and natural language processing (NLP)
- Expertise in the development languages Java and SQL
- Experience with cloud technologies (Azure, Databricks, Cosmos DB); deploying apps into Azure and interfacing with Cosmos DB

Plusses:
- Experience managing a team, or the ambition to do so
- Relevant industry experience: healthcare, insurance, finance, etc.
- Certifications in data science/NLP/Azure
- Python
- Other cloud experience
- Mathematics background
- Georgia Tech education

A client of Insight Global is looking for a Big Data Developer with Azure, Java, and SQL skills to join their linking team. The linking team is responsible for linking billions of public and proprietary records to create a unique entity identifier for consumers, organizations, and providers. We're looking for an experienced, smart, driven individual who will help us build and deliver enterprise-wide big data linking solutions by resolving complex analytical problems using quantitative approaches with a unique blend of analytical, mathematical, and technical skills. This includes analyzing and linking large data sets using various statistical techniques, developing data models, implementing solutions, and providing support. This position provides assistance and input to management, contributes to large multifunctional development activities, solves complex technical problems, writes complex code for computer systems, and serves as a senior source of expertise. The ideal candidate will have 5+ years of data engineering experience with excellent Java and SQL skills, deploying applications into Azure and interfacing with Cosmos DB. A background in mathematics, statistical analysis, and data science is a nice-to-have.
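The listing does not describe the team's actual linking method, but the work it outlines is classic entity resolution: group records that refer to the same real-world entity, assign a shared identifier, and measure the result with precision and recall over record pairs. A minimal illustrative sketch (normalization rules, field names, and ground truth are all hypothetical, not the client's system):

```python
from itertools import combinations

def normalize(rec):
    """Crude blocking key: lowercase the name, strip punctuation and whitespace."""
    return (rec["name"].lower().replace(".", "").strip(), rec["zip"])

def link_records(records):
    """Assign a shared entity ID to records with the same blocking key."""
    entity_ids = {}
    linked = []
    for rec in records:
        key = normalize(rec)
        if key not in entity_ids:
            entity_ids[key] = len(entity_ids)  # mint a new unique entity identifier
        linked.append({**rec, "entity_id": entity_ids[key]})
    return linked

def pair_set(linked):
    """All record-index pairs the linker placed in the same entity."""
    return {
        (i, j)
        for (i, a), (j, b) in combinations(enumerate(linked), 2)
        if a["entity_id"] == b["entity_id"]
    }

records = [
    {"name": "Acme Corp.", "zip": "30005"},
    {"name": "acme corp", "zip": "30005"},
    {"name": "Acme Corp.", "zip": "98006"},
]
linked = link_records(records)

# Pairwise precision/recall against a (toy) ground truth of matching pairs.
true_pairs = {(0, 1)}
pred_pairs = pair_set(linked)
precision = len(pred_pairs & true_pairs) / len(pred_pairs)
recall = len(pred_pairs & true_pairs) / len(true_pairs)
```

Real linking pipelines replace the exact-match key with probabilistic or learned similarity scoring, which is where the statistical techniques the listing asks about come in.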

Nov 17, 2023 | Orlando, FL | Database Administrator (DBA) | Perm

Principal Product Manager - Big Data
$134,000 - $201,000 (annual estimate)

Must haves:
- 10 years of experience
- Road-mapping launched big data and portfolio management applications
- Agile (preferred) and Waterfall knowledge
- Cloud: Azure DevOps
- Confluence (wiki)

Plusses:
- Aha!

The Principal Product Manager - Big Data will join the technical product management team to provide expertise in building outstanding technology software and products. This role will focus on Data and Portfolio projects. They will ultimately be responsible for defining, building, and supervising product capabilities to address identified market and business needs or customer requests. This is both a highly strategic and a tactical role, requiring the ability to accomplish goals through influence and leadership. You are responsible for driving company success by performing the following tasks to the highest standards: Create and maintain product and feature requirements reflecting the technical product roadmap. Develop new technology products and features aligned with the company's technology, strategic direction, and business imperatives. Take product initiatives: work with architecture, user experience, and various engineering teams to define product releases and features, and supervise the end-to-end software product development and release process. Work closely with internal business collaborators such as Product Strategy, Marketing, Sales, and other business units on the launch readiness process. Deliver product launch awareness training, review product documentation, and help with questions from the customer-facing business team related to technology and software launches.

Oct 09, 2023 | Bellevue, WA | Data Warehousing | Contract | $75 - $113 (hourly estimate)

Sr. Data Architect

Requirements:
- 10+ years of experience in enterprise-level, data-focused environments
- 5+ years of experience in a data architecture role
- Hands-on experience is a must for this role
- Experience providing end-to-end Data Warehouse, Business Intelligence, and/or Big Data architecture solutions
- Experience with Teradata, Informatica, SQL, or Oracle
- Experience with cloud, specifically Azure
- Experience with data flows, data models, data mapping, data quality, and data tables
- Strong presentation and customer service skills; experience creating and driving proofs of concept
- Experience in fast-paced development environments, preferably Agile

An employer is looking for a Data Architect in the Bellevue, WA area. As a Data Architect, you will architect, design, and build enterprise-class, data-centric solutions for this enterprise-wide Business to Business group. Your success will be measured on your ability to rapidly develop a keen understanding of information architecture while working closely with other architecture and development teams to build world-class solutions. This position will work very closely with Leadership, Principal Architects, and Development team members to build out and execute the data strategy for B2B systems. Requires competency in customer focus, change & innovation, strategic thinking, relationship building & influencing, and results focus.

Oct 31, 2023 | Alpharetta, GA | Data Warehousing | Contract, Perm Possible

Principal Software/Data Engineer
$55 - $82 (hourly estimate)

Must haves:
- Bachelor's or Master's degree in computer science, mathematics, statistics, or an equivalent technical discipline
- 8+ years of data engineering experience working with big data analytics and numerous large data sets/data warehouses
- Ability to assemble, analyze, and evaluate big data and make appropriate, well-reasoned recommendations to stakeholders
- Good analytical and problem-solving skills; good understanding of different data structures, algorithms, and their use in solving business problems
- Experience with search algorithms and/or preparing data for search applications
- Ability to collaborate with and lead internal and external technology resources in solving complex business needs
- Strong communication (verbal and written), customer service, interpersonal, and presentation skills applicable to a wide audience, including senior and executive management and customers
- Experience with big data technologies
- Knowledge of data science and NLP; understanding of precision and recall in data science
- Experience with data mining; familiarity with clustering and linking of data
- Exposure to text mining and natural language processing (NLP)
- Expertise in the development languages Java and SQL
- Experience with cloud technologies (Azure, Databricks, Cosmos DB); deploying apps into Azure and interfacing with Cosmos DB

Plusses:
- Experience managing a team, or the ambition to do so
- Relevant industry experience: healthcare, insurance, finance, etc.
- Certifications in data science or analytics
- Python

Day-to-Day:
- Design, develop, and maintain high-throughput, precise, ultra-low-latency, industry-leading big data linking solutions
- Generate and own end-to-end solutions for loosely defined business problems by leveraging pattern detection over potentially large datasets
- Provide project management and technical leadership for every aspect of the SDLC
- Coordinate and collaborate with technology and business stakeholders, including gathering and determining the feasibility of customer requirements
- Recommend technical strategy, participate in the evolution of the architecture, design solutions, and build and integrate the implementation
- Lead, plan, and implement core changes that impact enterprise-wide critical functions/processes
- Work in iterative processes to perform advanced data analysis, validate findings or test hypotheses, and communicate results and methodology
- Communicate technical information successfully with technical and non-technical audiences such as third-party vendors, external customer technical departments, various levels of management, and other relevant parties
- Collaborate effectively with all team members and hold regular team meetings

Summary: A client of Insight Global is looking for a Principal Software/Data Engineer focused on Big Data Analytics to join their Linking Team. The linking team is responsible for linking billions of public and proprietary records to create a unique entity identifier for consumers, organizations, and providers. We're looking for an experienced, smart, driven individual who will help us build and deliver enterprise-wide big data linking solutions by resolving complex analytical problems using quantitative approaches with a unique blend of analytical, mathematical, and technical skills. This includes analyzing and linking large data sets using various statistical techniques, developing data models, implementing solutions, and providing support. This position provides assistance and input to management, develops and leads large multifunctional development activities, solves complex technical problems, writes complex code for computer systems, and serves as a senior source of expertise. The position may also provide sizing or budget recommendations to management. This role is intended to convert to a Tech Lead position working on big data cloud projects, so strong communication skills will be essential for success. The ideal candidate will have 8+ years of data engineering experience with excellent Java and SQL skills, deploying applications into Azure and interfacing with Cosmos DB.

Nov 20, 2023 | Tampa, FL | Data Warehousing | Contract-to-perm

Data Scientist
$68 - $101 (hourly estimate)

Requirements:
- Power BI with ML integration
- Working in Azure environments, including Azure ML
- Familiarity with DevOps and CI/CD as well as Agile tools and processes, including Git and Azure DevOps
- Demonstrated experience in machine learning techniques, probabilistic reasoning, data science, and/or optimization
- Proven ability in creating explainable models and implementing advanced algorithms into production
- Experience implementing supervised and unsupervised machine learning techniques, analysis of variance (ANOVA), and statistical significance tests
- Demonstrated programming experience in Python, R, Keras, and TensorFlow with .NET integration
- Experience developing/implementing analytic solutions in the Azure cloud that leverage relational, in-memory, NoSQL, document, and/or graph databases
- Experience designing and implementing data pipelines for analytical model consumption of structured, semi-structured, and/or unstructured data in batch and real-time environments
- Experience with APIs and web services
- Good understanding of dimensional data modeling, data transformation, and designing analytical data structures
- Good SQL skills; broad exposure to all language constructs
- Data integration / ETL / ELT tools

A client in the Tampa Bay area is looking for a Data Scientist to join their team. This person will be involved in the design and development efforts for our big data solutions, including data lake, business intelligence, machine learning, data pipeline, and cloud-based data warehouse products. You must have direct experience in design and implementation in areas such as machine learning, artificial intelligence, operations research, or statistical methods. You will act as a key contributor to all phases of the design and development lifecycle of analytic applications utilizing various technology platforms.

Nov 07, 2023 | Durham, NC | Data Warehousing | Perm

Deidentification Data Engineer
$90,000 - $135,000 (annual estimate)

Must haves:
- 5+ years of experience in a hands-on Data Engineer role
- Bachelor's degree in a related field
- Experience with relational SQL and NoSQL databases
- Writing and executing Python programs and shell scripts on Linux
- Linux administration experience
- Data engineering on Microsoft Azure
- Experience with data pipeline and orchestration tools such as Azure Data Factory and SQL Server Integration Services
- Developing on cloud-based analytics platforms such as Azure Synapse

Plusses:
- Prior experience in health care IT
- Working knowledge of Azure DevOps and automation/orchestration
- Knowledge of open source software solutions and open source as a business model
- Technical breadth across application development, enterprise architecture, or application integration
- Understanding of Agile methodology
- Knowledge of APIs, API integration, and API management

Job Description: This position reports to the Data Partnerships Director of Data and Analytics Platforms. It is responsible for the management and execution of the deidentification processes applied to the data assets to be included in the Federated Clinical Applications Platform (FCAP), and for the management and administration of the FCAP deidentification cloud environment. The position will be a member of the Data Partnerships Data Engineering team and will additionally provide expertise in the development of data integration and delivery pipelines to deliver new data modalities into the FCAP and Data Lake. These solutions will capitalize on technologies to improve the value of analytical data, improve the effectiveness of information stewardship, and streamline the flow of data in the organization. Solutions will focus on state-of-the-art data and analytics tools, including traditional and near-real-time data warehousing, big data, and relational and document-based databases, using extract, load, transform (ELT) toolsets as well as REST APIs and FHIR. The ideal candidate will be comfortable with data science platforms and have proven experience leveraging DevOps and automation/orchestration tools.

Job Responsibilities:
- Create and follow defined procedures in the deidentification of patient medical information
- Maintain and tune the deidentification environment to perform optimally and comply with policies and standards
- Collaborate with partners on improving the deidentification programs and processes, and work with the partner and the Cloud Team on troubleshooting issues
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Recommend designs for analytics solutions that improve data integration, data quality, and data delivery, with an eye toward re-usable components
- Create and maintain an optimal data pipeline architecture
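The listing does not specify which deidentification techniques the FCAP environment uses, but a common building block in roles like this is keyed pseudonymization: direct identifiers are replaced with stable tokens so records can still be joined across datasets without exposing the originals. A minimal sketch under that assumption (the field names, key handling, and token length here are illustrative only; real clinical deidentification also covers dates, free text, and quasi-identifiers per policy):

```python
import hashlib
import hmac

SECRET_KEY = b"rotate-me"  # hypothetical; in practice pulled from a managed key vault

def pseudonymize(value: str) -> str:
    """Replace a direct identifier with a stable keyed token (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

def deidentify(record: dict, direct_identifiers=("mrn", "name", "ssn")) -> dict:
    """Tokenize direct identifiers; pass other fields through for downstream review."""
    return {
        k: pseudonymize(v) if k in direct_identifiers else v
        for k, v in record.items()
    }

patient = {"mrn": "12345", "name": "Jane Doe", "dx_code": "E11.9"}
clean = deidentify(patient)
```

Because the token is keyed rather than a plain hash, the same patient maps to the same token across loads, but an attacker without the key cannot enumerate identifiers to reverse the mapping.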

Sep 19, 2023 | Dublin, CA | Data Warehousing | Contract-to-perm | $60 - $90 (hourly estimate)

Data Engineer

Requirements:
- 8+ years of in-depth data engineering experience and execution of data pipelines, DataOps, scripting, and SQL queries
- 5+ years of proven data modeling skills; must have demonstrable experience designing models for data warehousing and modern analytics use cases (e.g., from operational data store to semantic models)
- Expert in Python for data analytics, visualization, machine learning, and models
- 2-3 years' experience in modern data architectures that support advanced analytics, including Snowflake, Azure, etc.; experience with Snowflake and other cloud data warehousing / data lake platforms preferred
- Expert in engineering data pipelines using various data technologies (ETL/ELT, big data technologies such as Hive and Spark) on large-scale data sets, demonstrated through years of experience
- Hands-on data warehouse design, development, and data modeling best practices for modern data architectures
- Highly skilled in data orchestration, with experience in tools like Ctrl-M and Apache Airflow; hands-on DevOps/DataOps experience required

Plusses:
- Knowledge of or working experience in reporting tools such as MicroStrategy or Power BI
- Experience with StreamSets and dbt preferred
- Retail domain experience

Insight Global is looking for a Data Engineer to support one of our largest clients in the retail industry. They will be responsible for designing and modeling data engineering pipelines that support enterprise reporting and analytic needs. They will engineer efficient, adaptable, scalable data pipelines for moving data from different sources into their Cloud Lakehouse. They will understand and analyze business requirements and translate them into well-architected solutions that demonstrate the modern BI & Analytics platform. In addition, they will be a part of data modernization projects, providing direction on matters of overall design and technical direction and acting as the primary driver toward establishing guidelines and approaches. Other responsibilities include driving timely and proactive issue identification, escalation, and resolution, and collaborating effectively with Data Tech and BI teams to design and build optimized data flows from source to data visualization.

Sep 05, 2023

San Diego, CA

|

Architect

|

Perm

|

$71k - $107k (estimate)

{"JobID":309289,"JobType":["Perm"],"EmployerID":null,"Location":{"Latitude":-117.110090909091,"Longitude":33.0389090909091,"Distance":null},"State":"California","Zip":"92127","ReferenceID":"SDG-648865","PostedDate":"\/Date(1693955785000)\/","Description":"An enterprise retail client is seeking a Mexico-based Remote Data Architect to join their team. This Architect will be a part of the integration of data platforms that handle customer data. This Architect will need to integrate and manage everything related to the customer data on a brand-new platform and an existing MDM platform. This platform and application will be heavily used across the enterprise. This candidate will be involved in requirements analysis, creating the design, etc. They will be doing 75% requirements analysis/design and 25% hands-on coding.","Title":"Remote Data Architect- INTL MEXICO","City":"San Diego","ExpirationDate":null,"PriorityOrder":0,"Requirements":"*3+ years developing Big Data solutions (architectures, deployments, and operations - including design and estimation) *Experience with API strategy *2+ years of experience within an AWS environment *5+ years of experience architecting and designing MDM solutions using Reltio, Stibo, or another customer data platform *3+ years of experience architecting and designing CDI platforms *Expertise in data models, data pipeline concepts, and cloud-based infrastructure disciplines","Skills":"Experience with Kafka, Snowflake, and Airflow","Industry":"Architect","Country":"US","Division":"IT","Office":null,"IsRemoteJob":true,"IsInternalJob":false,"ExtraValues":null,"__RecordIndex":0,"__OrdinalPosition":0,"__Timestamp":0,"Status":null,"ApplicantCount":0,"SubmittalCount":0,"ApplicationToHireRatio":0,"JobDuration":null,"SalaryHigh":106800.0000,"SalaryLow":71200.0000,"PayRateOvertime":0,"PayRateStraight":0,"Filled":0,"RemainingOpenings":0,"TotalOpenings":0,"Visa":null,"ClearanceType":null,"IsClearanceRequired":false,"IsHealthcare":false,"IsRemote":false,"EndClient":null,"JobCreatedDate":"\/Date(-62135578800000)\/","JobModifiedDate":"\/Date(-62135578800000)\/"}

An enterprise retail client is seeking a Mexico-based Remote Data Architect to join their team. This Architect will be a part of the integration of data platforms that handle customer data. This Architect will need to integrate and manage everything related to the customer data on a brand-new platform and an existing MDM platform. This platform and application will be heavily used across the enterprise. This candidate will be involved in requirements analysis, creating the design, etc. They will be doing 75% requirements analysis/design and 25% hands-on coding.

Oct 21, 2022

Saint Louis, MO

|

Programmer / Developer

|

Contract-to-perm

{"JobID":231039,"JobType":["Contract-to-perm"],"EmployerID":null,"Location":{"Latitude":-90.2169090909091,"Longitude":38.6428181818182,"Distance":null},"State":"Missouri","Zip":"63108","ReferenceID":"CLT-580761","PostedDate":"\/Date(1666376210000)\/","Description":"Insight Global is looking for Senior Hadoop Developers to support Data and Analytics Platform, Information Management, and Solution Delivery for one of our largest Financial clients in either Dallas, TX; Charlotte, NC; Pennington, NJ; or New York. The role ensures the design and engineering approach for complex data solutions is consistent across multiple flows and systems, while building processes to support data transformation, data structures, metadata, data quality controls, and dependency and workload management. The individual will be responsible for defining internal controls, identifying gaps in adherence to data management standards and working with the appropriate partners to develop plans to close them, leading concept and experimentation testing to synthesize the results and validate and improve the solution, and documenting and communicating required information for deployment, maintenance, support, and business functionality. They may be required to mentor more junior Data Engineers and coach team members in delivery/release activities.","Title":"Hadoop Developer (ToD)","City":"Saint Louis","ExpirationDate":null,"PriorityOrder":0,"Requirements":"* 3-6+ years of experience in the Hadoop stack and storage technologies (HDFS, MapReduce, YARN, Hive, Sqoop, Impala, Spark, Flume, Kafka, and Oozie) * 1-3+ years of experience in Scala programming for Big Data * Extensive knowledge of Big Data enterprise architecture (Cloudera preferred) * Experienced in HBase, RDBMS, SQL, ETL, and data analysis * Experience in NoSQL technologies (e.g., Cassandra, MongoDB, etc.)
* Experienced in scripting (Unix/Linux) and scheduling (Autosys) * Experience with team delivery/release processes and cadence pertaining to code deployment and release * Experience in Hive tuning, bucketing, and partitioning, debugging, and tracing errors/exceptions in Spark job execution * Knowledgeable in techniques for designing Hadoop-based file layouts optimized for the business","Skills":"* Object-oriented programming and design experience * Experience with automated testing methodologies and frameworks, including JUnit, is a plus * Exposure to Java/Spring framework * Fundamentals of Python -- data structures, collections, Pandas for file and other types of data handling, visualizations, etc. * Visual analytics tools knowledge (Tableau) * Degree in Computer Science or equivalent * Any Big Data certification (e.g., Cloudera\u0027s CCP, CCA) is a plus * Excellent analytical capabilities - strong interest in algorithms * A team player with good verbal and written skills, capable of working with a team of Architects, Developers, Business/Data Analysts, QA, and client stakeholders * Proficient understanding of distributed computing principles * Python web frameworks (Django, Flask), data wrangling, and analytics in a Python-based environment * Experience with Big Data analytics \u0026 business intelligence and industry-standard tools integrated with the Hadoop ecosystem (R, Python) * Data integration and data security on the Hadoop ecosystem (Kerberos) * Research-oriented, motivated, proactive self-starter with strong technical, analytical, and interpersonal skills * Versatile resource with balanced development skills and business acumen, able to operate at a fast and accurate pace","Industry":"Programmer / Developer","Country":"US","Division":"IT","Office":null,"IsRemoteJob":true,"IsInternalJob":false,"ExtraValues":null,"__RecordIndex":0,"__OrdinalPosition":0,"__Timestamp":0,"Status":null,"ApplicantCount":0,"SubmittalCount":0,"ApplicationToHireRatio":0,"JobDuration":null,"SalaryHigh":98.1360,"SalaryLow":65.4240,"PayRateOvertime":0,"PayRateStraight":0,"Filled":0,"RemainingOpenings":0,"TotalOpenings":0,"Visa":null,"ClearanceType":null,"IsClearanceRequired":false,"IsHealthcare":false,"IsRemote":false,"EndClient":null,"JobCreatedDate":"\/Date(-62135578800000)\/","JobModifiedDate":"\/Date(-62135578800000)\/"}

Insight Global is looking for Senior Hadoop Developers to support Data and Analytics Platform, Information Management, and Solution Delivery for one of our largest Financial clients in either Dallas, TX; Charlotte, NC; Pennington, NJ; or New York. The role ensures the design and engineering approach for complex data solutions is consistent across multiple flows and systems, while building processes to support data transformation, data structures, metadata, data quality controls, and dependency and workload management. The individual will be responsible for defining internal controls, identifying gaps in adherence to data management standards and working with the appropriate partners to develop plans to close them, leading concept and experimentation testing to synthesize the results and validate and improve the solution, and documenting and communicating required information for deployment, maintenance, support, and business functionality. They may be required to mentor more junior Data Engineers and coach team members in delivery/release activities.

Nov 28, 2023

Oldsmar, FL

|

Software Engineering

|

Contract

{"JobID":325705,"JobType":["Contract"],"EmployerID":null,"Location":{"Latitude":-82.6557272727273,"Longitude":28.0451818181818,"Distance":null},"State":"Florida","Zip":"34677","ReferenceID":"TPA-666916","PostedDate":"\/Date(1701191773000)\/","Description":"Insight Global is seeking strong developers to join a team at one of our large Media clients remotely. As a Developer, you will be responsible for the development of client software applications. This software is used across the globe and consists of very complex data structures and algorithms. You will be working in an Agile environment and will be expected to test the functionality of any features you work on, as well as to write unit/automated tests. The ideal candidate will demonstrate effective communication skills.","Title":"INTL (INDIA/AUS) - Sr. Developer","City":"Oldsmar","ExpirationDate":null,"PriorityOrder":0,"Requirements":"- Strong experience in csharp/.NET (4.6+) - Either WCF or WPF experience (both preferred, but just one is acceptable) - ASP.NET experience - Experience with MS SQL Server - Proficient in writing high-performance/efficient code - Strong understanding of data structures and algorithms - Effective communication skills (English)","Skills":"- Big Data experience - Experience in the Media domain","Industry":"Software Engineering","Country":"US","Division":"IT","Office":null,"IsRemoteJob":true,"IsInternalJob":false,"ExtraValues":null,"__RecordIndex":0,"__OrdinalPosition":0,"__Timestamp":0,"Status":null,"ApplicantCount":0,"SubmittalCount":0,"ApplicationToHireRatio":0,"JobDuration":null,"SalaryHigh":27.3000,"SalaryLow":18.2000,"PayRateOvertime":0,"PayRateStraight":0,"Filled":0,"RemainingOpenings":0,"TotalOpenings":0,"Visa":null,"ClearanceType":null,"IsClearanceRequired":false,"IsHealthcare":false,"IsRemote":false,"EndClient":null,"JobCreatedDate":"\/Date(-62135578800000)\/","JobModifiedDate":"\/Date(-62135578800000)\/"}

Insight Global is seeking strong developers to join a team at one of our large Media clients remotely. As a Developer, you will be responsible for the development of client software applications. This software is used across the globe and consists of very complex data structures and algorithms. You will be working in an Agile environment and will be expected to test the functionality of any features you work on, as well as to write unit/automated tests. The ideal candidate will demonstrate effective communication skills.

1 - 10 of 15