Data Engineer
Tenet Healthcare

Benefits:
- 15% Night differential
- 20 Paid Time Off (PTO) days per year
- Annual Appraisal
- Annual Incentive
- Hybrid Work Arrangement
- Weekends off
- HMO with 2 FREE dependents
- Group life insurance
Position Summary
The Data Engineer participates in the design, development, and implementation of solutions for the data warehouse. The warehouse's primary focus is data and analytics that support business operations across Clinical, Hospital Performance, and Corporate functions.
This role is cloud facing and requires experience building and managing data engineering code in modern cloud-based technologies such as Google BigQuery or equivalent. We are looking for a high-energy individual who is willing to learn and evolve and wants to contribute to a high-impact healthcare environment.
Job Duties and Responsibilities:
- Assembling small to medium complexity data sets that meet functional and non-functional business requirements
- Building the data pipelines required for optimal extraction, transformation, and loading of data from various sources using GCP and SQL technologies (see the illustrative sketch after this list)
- Building analytical tools that utilize the data pipeline, providing actionable insight into key business performance metrics
- Working with stakeholders, including data, design, and product teams, and assisting them with data-related technical issues
- Participating in unit and system testing and following existing change control processes when promoting solutions to production systems, escalating issues as needed
- Designing, developing, reviewing, testing, and deploying data warehouse solutions using CI/CD
- Identifying data integrity issues and analyzing data and process flows for process improvement opportunities
- Monitoring system performance and evaluating query execution plans to improve overall system performance
- Working with the Integration Architect to develop, test, and deploy data pipelines
- Participating in troubleshooting and maintaining existing solutions as required
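For illustration, here is a minimal sketch of the kind of extract-load-transform step this pipeline work involves, using the google-cloud-bigquery Python client. All project, bucket, dataset, and table names below are hypothetical placeholders, not Tenet systems.

```python
# Minimal, hypothetical ELT sketch: load a CSV extract into a BigQuery
# staging table, then run a SQL transformation into a reporting table.
# All names (bucket, project, dataset, tables, columns) are placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Extract/load: copy a raw file from Cloud Storage into a staging table.
load_job = client.load_table_from_uri(
    "gs://example-bucket/extracts/encounters.csv",  # hypothetical source
    "example-project.staging.encounters",           # hypothetical table
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    ),
)
load_job.result()  # wait for the load to finish

# Transform: aggregate staged rows into a reporting table with SQL.
client.query(
    """
    CREATE OR REPLACE TABLE `example-project.reporting.daily_encounters` AS
    SELECT DATE(admit_ts) AS admit_date, facility_id, COUNT(*) AS encounters
    FROM `example-project.staging.encounters`
    GROUP BY admit_date, facility_id
    """
).result()  # wait for the transformation to finish
```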
Qualifications:
- Bachelor's degree from an accredited college/university in a technology-related field, or an equivalent combination of education, training, and experience
- Ability to build and optimize data sets and ‘big data’ pipelines
- Ability to perform root cause analysis on external and internal processes and data to identify opportunities for improvement and answer questions
- Ability to build processes that support data transformation, workload management, data structures, dependency, and metadata
- Strong SQL and relational database design/development skills
- Development experience with modern cloud-based data warehouses such as Google BigQuery or equivalent
- Demonstrated understanding of and experience with relational SQL databases, including BigQuery or equivalent, and functional/object-oriented scripting languages, including Scala, Java, and Python
- Familiarity with big data tools such as Kafka and Spark and with workflow management tools such as Airflow (see the illustrative DAG sketch after this list)
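As a small illustration of the workflow management skills above, the sketch below defines a two-task Airflow DAG that chains an extract step to a transform step. It assumes Airflow 2.4+ (for the `schedule` argument); the DAG name and task bodies are hypothetical placeholders.

```python
# Minimal, hypothetical Airflow DAG: run a daily extract, then a transform.
# DAG id, task ids, and task logic are placeholders, not a real pipeline.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("extracting source data")  # stand-in for a real extraction step


def transform():
    print("running warehouse transformation")  # stand-in for a real SQL step


with DAG(
    dag_id="warehouse_daily_load",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+ spelling
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task   # extract must finish before transform
```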
How to apply
To apply for this job, you need to sign in on our website. If you don't have an account yet, please register.