Find Your Perfect Job

Job Search Results for big data

18 Results for big data

Feb 06, 2024 | Irving, TX | Help Desk | Contract, Perm Possible | $43 - $64 (hourly estimate)

Azure Big Data Support Engineer (Reference ID: DAL-680748)

As a Support Engineer, you will use advanced troubleshooting methods and tools to solve technically complex problems. These highly complex problems require broad, in-depth product knowledge and may include support of additional product lines. We'll provide you with abundant resources, including a rich content library, advanced diagnostic tools, and access to other Microsoft experts.

Key Responsibilities: As a Support Engineer, you will represent Microsoft and communicate with corporate customers via telephone, written correspondence, or electronic service regarding escalated problems in Microsoft software products, and manage relationships with those customers. It's your chance to:
- Demonstrate strong interpersonal and communication skills while working with diverse audiences, including highly technical IT professionals, developers, architects, and executive management.
- Exhibit leadership through personal responsibility, accountability, and teamwork.
- Act as a technical focal point in cooperative relationships with other companies.
- Manage crisis situations that may involve technically challenging issues and diverse audiences.
- Be responsive to customer needs, which may sometimes require work outside of normal business hours or an on-call rotation.
- Use trace analysis and other proprietary tools to analyze problems and develop solutions to meet customer needs.

Required Skills & Experience:
- 3+ years of technical support or technical consulting experience, preferably at an enterprise level
- Strong experience with at least one of the following: Azure Databricks, Azure Synapse, or HDInsight
- Experience scripting in Python
- Strong experience with Spark
- Basic Azure networking

Nice to Have Skills & Experience:
- Experience in any RDBMS or an understanding of general RDBMS concepts
- Azure data engineering or Azure networking certification


Feb 21, 2024 | Mountain View, CA | Data Warehousing | Contract | $50 - $76 (hourly estimate)

Data Analyst/Data Engineer (Reference ID: SJC-684480)

Insight Global is looking for a Data Analyst/Data Engineer to join a growing team at a large-scale ISP company. This is a chance to join the NetEng organization within the capacity team. You will be working on a ground-up project, playing a major role in bringing the right data to light for the team. This is a chance to use not only your data analytics skills, but also to grow your visualization skill set and really use your data verification skills. The data for this team will be connected to OSS and BSS systems and a combination of network metrics.

Required Skills & Experience:
- 5+ years in a Data Analytics/Data Engineer role
- Understanding of dashboarding and data visualization
- Background working with large data sets (any big data tools will work)
- Understanding of text file ingestion (flat files)
- Ability to script (open on language)
- Strong understanding of SQL and ability to write queries
- Strong communication skills

Nice to Have Skills & Experience:
- Strong Python scripting
- Background with network (router/switch) data
- Strong analytical mind (understanding of the full lifecycle of data: why and where the data is going and coming from)


Feb 09, 2024 | Bellevue, WA | Data Warehousing | Contract | $75 - $113 (hourly estimate)

Sr. Data Architect (Reference ID: SEA-681846)

An employer is looking for a Sr. Data Architect in the Bellevue, WA area. As a Data Architect, you will architect, design, and build enterprise-class, data-centric solutions for this enterprise-wide Business-to-Business group. Your success will be measured by your ability to rapidly develop a keen understanding of information architecture while working closely with other architecture and development teams to build world-class solutions. This position will work very closely with Leadership, Principal Architects, and Development team members to build out and execute the data strategy for B2B Systems. Requires competency in customer focus, change & innovation, strategic thinking, relationship building & influencing, and results focus.

Required Skills & Experience:
- 10+ years of experience in enterprise-level, data-focused environments
- 5+ years of experience in a data architecture role
- Hands-on experience is a must for this role
- Experience providing end-to-end Data Warehouse, Business Intelligence, and/or Big Data architecture solutions
- Experience with Teradata, Informatica, SQL, or Oracle
- Experience with cloud, specifically Azure
- Experience with data flows, data models, data mapping, data quality, and data tables
- Strong presentation and customer service skills, and experience creating and driving proofs of concept
- Experience in fast-paced development environments, preferably Agile


Jan 18, 2024 | West Hollywood, CA | Software Engineering | Contract | $70 - $105 (hourly estimate)

Remote Senior Data Engineer (Reference ID: LAX-676027)

The Data Engineer will be responsible for building, growing, and optimizing our data architecture, including database management, data pipeline/ETL creation and management, and overall data infrastructure management. You will also optimize data flows: maintaining, improving, cleaning, and manipulating data in operational and analytics databases, and working with analytics teams to use databases to build custom analytics for business partners. This leader will be responsible for ensuring that the client's core data reporting systems are meeting the needs of the business. This team will have a central role in strategy, requirements definition, and prioritization for Data Ingestion, Data Accessibility, and Business Intelligence.

Responsibilities:
- Build and maintain large-scale data structures and pipelines to organize data for new and existing projects and data products
- Build the scalable infrastructure required for optimal ETL/ELT of data from a wide variety of data sources using SQL, GCP, BigQuery, Python, and AWS technologies
- Monitor and optimize data delivery, develop tools for real-time and offline analytics, and recommend ways to constantly improve data quality and reliability
- Participate in the assessment, selection, and integration processes for our big data platform required to satisfy business needs, ensuring that all systems meet business objectives
- Design, construct, and maintain disaster recovery procedures
- Collaborate with business intelligence analysts to appropriately use data structures
- Support agile project management processes in a rigorous, results-driven environment
- Collaborate in cross-divisional planning, develop accurate work-level assessments and timelines, and help translate product and feature needs into delivery plans
- Partner with other data engineers and developers to ensure data and event tracking specifications, and review certified and ad hoc SQL and query optimization

Required Skills & Experience:
- 6+ years of experience building scalable applications using Python, SQL, GCP, and BigQuery
- Experience with the GCP tech stack: pulling data from different sources, maintaining its history, and creating a semantic layer for the data
- Strong SQL and Python knowledge for data quality and validation
- Experience with real-time (RT) or near real-time (NRT) streaming data
- In-depth knowledge of cloud database architectures, schema development, and data modeling with tools including BigQuery, Redshift, and/or MongoDB

Nice to Have Skills & Experience:
- Experience with clickstream or user behavior data
- In-depth experience with Snowflake for processing, data modeling, and data mart creation
- Knowledge of analytics tag management solutions and mobile tracking platforms such as Google Analytics, Adobe Analytics, Kochava, Adjust, AppsFlyer, or Tune
- Prior business intelligence experience (working on or with BI teams to provide analytics)
- Background on a high-impact, high-visibility team in the streaming industry (Amazon, Hulu, Disney, etc.)


Feb 20, 2024 | Bentonville, AR | Programmer / Developer | Contract-to-perm | $48 - $72 (hourly estimate)

Data Engineer (Reference ID: BEN-684298)

One of our largest national retail clients is looking for a Data Engineer to join their growing organization. You will be a key contributor; 90% of your time will be heads-down work scaling an application and making tech upgrades. This application is continuously evolving to stay industry leading, so you will get the chance to gain knowledge of the next cutting-edge technologies.

Required Skills & Experience:
- 6-10 years of big data experience
- Experience developing ETL pipelines on Spark
- Programming exclusively with Scala on Spark within the past year
- BigQuery
- GCP

Nice to Have Skills & Experience:
- Trino SQL
- Hive
- Hadoop
- Any scheduling tool (e.g., Airflow)


Feb 06, 2024 | West Hollywood, CA | Software Engineering | Contract | $60 - $90 (hourly estimate)

Remote Data Engineer (Reference ID: LAX-680680)

The Data Engineer will be responsible for building, growing, and optimizing our data architecture, including database management, data pipeline/ETL creation and management, and overall data infrastructure management. You will also optimize data flows: maintaining, improving, cleaning, and manipulating data in operational and analytics databases, and working with analytics teams to use databases to build custom analytics for business partners. This leader will be responsible for ensuring that the client's core data reporting systems are meeting the needs of the business. This team will have a central role in strategy, requirements definition, and prioritization for Data Ingestion, Data Accessibility, and Business Intelligence.

Responsibilities:
- Build and maintain large-scale data structures and pipelines to organize data for new and existing projects and data products
- Build the scalable infrastructure required for optimal ETL/ELT of data from a wide variety of data sources using SQL, GCP, BigQuery, Python, and AWS technologies
- Monitor and optimize data delivery, develop tools for real-time and offline analytics, and recommend ways to constantly improve data quality and reliability
- Participate in the assessment, selection, and integration processes for our big data platform required to satisfy business needs, ensuring that all systems meet business objectives
- Design, construct, and maintain disaster recovery procedures
- Collaborate with business intelligence analysts to appropriately use data structures
- Support agile project management processes in a rigorous, results-driven environment
- Collaborate in cross-divisional planning, develop accurate work-level assessments and timelines, and help translate product and feature needs into delivery plans
- Partner with other data engineers and developers to ensure data and event tracking specifications, and review certified and ad hoc SQL and query optimization

Required Skills & Experience:
- 3-5 years of experience building scalable applications using Python, SQL, GCP, and BigQuery
- Experience with the GCP tech stack: pulling data from different sources, maintaining its history, and creating a semantic layer for the data
- Strong SQL and Python knowledge for data quality and validation
- Experience with real-time (RT) or near real-time (NRT) streaming data
- In-depth knowledge of cloud database architectures, schema development, and data modeling with tools including BigQuery, Redshift, and/or MongoDB

Nice to Have Skills & Experience:
- Experience with clickstream or user behavior data
- In-depth experience with Snowflake for processing, data modeling, and data mart creation
- Knowledge of analytics tag management solutions and mobile tracking platforms such as Google Analytics, Adobe Analytics, Kochava, Adjust, AppsFlyer, or Tune
- Prior business intelligence experience (working on or with BI teams to provide analytics)
- Background on a high-impact, high-visibility team in the streaming industry (Amazon, Hulu, Disney, etc.)


Feb 15, 2024 | Durham, NC | Database Administrator (DBA) | Perm | $120k - $180k (estimate)

Sr. Data Engineer (Reference ID: RAL-683315)

This position reports to the Data Partnerships Director of Data and Analytics Platforms. This individual will be primarily responsible for the development of data integration and delivery pipelines while also expanding the FHIR-based content stored within the data lake. These solutions will capitalize on technologies to improve the value of analytical data, improve the effectiveness of information stewardship, and streamline the flow of data in the organization. Solutions will focus on using state-of-the-art data and analytics tools, including traditional and near real-time integrations, big data, and delta lake architecture, using both extract, load, transform (ELT) toolsets as well as REST APIs and FHIR. The ideal candidate will also be comfortable with data science platforms and have proven experience leveraging DevOps and automation/orchestration tools.

Job Responsibilities:
- Create and maintain optimal data pipeline architecture
- Develop a data lake on Microsoft Azure using the medallion architecture, leveraging a Delta Lake format for the silver layer
- Assemble large, complex data sets that meet functional and non-functional business requirements
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Recommend designs for analytics solutions that improve data integration, data quality, and data delivery, with an eye toward reusable components
- Articulate differences, advantages, and disadvantages between architectural solution methods
- Work with Agile team members to document and execute test plans and data validation scripts; support the code promotion process through development and production as required using standard CI/CD processes
- Develop monitoring, logging, and error notification processes to ensure data is updated as expected and processing metrics are reported
- Participate in the creation and maintenance of standards for coding, documentation, error handling, error notification, logging, etc.
- Be accountable for conforming to established architectural, development, and operational standards and practices
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
- Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs
- Evaluate and recommend development tools
- Assist in application and data operations performance tuning
- Participate in system architecture design
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader
- Work with data and analytics experts to strive for greater functionality in our data systems
- Share troubleshooting and maintenance duties

Required Skills & Experience:
- Python development experience
- Experience implementing data lakes on Microsoft Azure
- Design and implementation experience
- Experience with data pipeline and workflow management tools such as Azure Data Factory and Synapse Analytics pipelines
- Experience with cloud-based analytics platforms such as Azure Synapse Analytics
- HIPAA experience

Nice to Have Skills & Experience:
- Experience with healthcare data lakes


Feb 14, 2024 | Tampa, FL | Computer Engineering | Contract-to-perm | $72 - $108 (hourly estimate)

Sr. Azure Data Engineer (Reference ID: TPA-683019)

Insight Global is looking for a Senior Data Engineer to join an AI-powered project intelligence solutions company located in Tampa, FL. This company has created a SaaS project tool that uses machine learning to predict project outcomes for architecture, engineering, and construction companies. This Data Engineer will join a product team of 17 to help enhance the current tool and to design and architect the evolution of the solution, allowing the tool to provide recommendations that improve the health of clients' projects. They will be working heavily with Databricks, Azure, and Lakehouse architecture. The ideal candidate will have architecture and design experience, but also the extensive hands-on engineering experience that will take up 80% of their day.

Required Skills & Experience:
- Bachelor's degree in CS
- 10+ years of experience designing and developing complex data analytics solutions
- 5+ years of experience with Microsoft big data solutions
- 7 years of hands-on experience with Azure products (listed by priority): Azure Databricks, Azure Databricks Unity Catalog, Databricks API, Databricks row-level security, Databricks error logging, Azure Data Lake Gen2, Azure SQL Server, Azure Analysis Services
- Experience with Lakehouse architecture and design for multi-tenancy: data isolation techniques, OLTP data modeling, dimensional data modeling, composite modeling, data transformation, row-level security, and designing optimal analytical data structures for near real-time data analytics
- Experience with Microsoft on-premises SQL Server (2017 or higher) and Azure SQL Server technologies; broad experience with SQL Server capabilities and tools: CDC, columnstore indexes, in-memory tables, SSAS Tabular, DAX, T-SQL, SSIS
- Experience using Azure DevOps and CI/CD, plus Agile tools and processes including Git and Azure Boards
- Experience with data integration: REST APIs, web services, ETL/ELT
- Intermediate to advanced experience with Python
- Experience with Power BI: Power BI Service, Power BI Gateway, Power BI Dataflow


Feb 15, 2024 | Durham, NC | QA | Perm | $88k - $132k (estimate)

QA Analyst (Reference ID: RAL-683297)

This position reports to the Data Partnerships Director of Data and Analytics Platforms. This individual will be primarily responsible for the quality of the data ingested into the data lake and the data deidentification processes in the Federated Clinical Analytics Platform (FCAP), including providing quality assurance oversight of the data pipelines and REST APIs developed by the Data Partnerships Data Engineering team and the FCAP partner. The candidate should be an expert in data profiling, root cause analysis, and strategies to ensure the quality of data being delivered through big data and analytics platforms. The QA Analyst will be a member of a Data Engineering Scrum team that is delivering solutions focused on state-of-the-art data and analytics tools, including traditional and near real-time data warehousing, big data, and relational and document-based databases, using both extract, load, transform (ELT) toolsets as well as REST APIs and FHIR. The ideal candidate will be comfortable with data science platforms and have proven experience leveraging DevOps and test automation tools.

Job Responsibilities:
- Measure data quality: design, collect, analyze, and report on data quality assurance and production performance metrics
- Manage data quality documentation; define and maintain data standards, definitions, and models
- Identify incorrect data, documenting issues, patterns, and gaps in the data and/or systems
- Provide accurate and appropriate interpretation of data, applying knowledge for the evaluation, analysis, and interpretation of data
- Ensure data integrity by implementing quality assurance practices
- Perform root cause analysis on data issues and recommend data quality controls to resolve gaps/issues
- Provide quality assurance oversight of data flows and APIs
- Contribute to agile story refinement and estimation
- Contribute as a member of the Data Engineering Scrum team by developing and executing test cases during sprints to verify that the APIs and data pipelines developed meet defined acceptance criteria
- Collaborate with the Scrum team and DevOps to automate test cases to be incorporated into the CI/CD pipelines
- Participate in the creation and maintenance of standards for coding, documentation, error handling, error notification, logging, etc.

Required Skills & Experience:
- 3-5 years of data-focused QA experience using and testing the following software/tools:
  - API testing with Postman
  - Relational SQL and NoSQL databases, including Oracle, SQL Server, Postgres, and Cosmos DB in Azure
  - Data pipeline, workflow management, and ELT tools such as Azure Data Factory and Synapse Pipelines
  - Object-oriented/object-function scripting languages such as Python or Java
  - RESTful APIs and web applications
  - Test automation experience with Azure DevOps
- Bachelor's degree

Nice to Have Skills & Experience:
- Python experience
- Prior experience in healthcare IT
- Data lake and data ingestion experience


Jan 25, 2024 | Saint Cloud, MN | Software Engineering | Contract | $52 - $78 (hourly estimate)

Azure Cloud Engineer (Reference ID: MSP-677790)

Insight Global is looking for an individual to join a team of data scientists and cloud engineers to assist in working on a cloud-based platform. This individual needs to have experience building platforms with machine learning and experience transitioning from one cloud to another. In this role you will be joining the team to implement data catalogs, vet effective tools, and develop the tools.

Required Skills & Experience:
- 6-12 years of experience designing, developing, and implementing cloud solutions using Azure
- Experience with machine learning, big data, analytics, etc.
- 6+ years of experience with Python
- Experience using data-oriented workflow orchestration frameworks

Nice to Have Skills & Experience:
- Experience with Docker, Terraform, Purview, Azure Data Factory, and other Azure services
- Experience with CI/CD pipelines


1 - 10 of 18