AI Data Engineer
Globe Telecom
Date: 10 hours ago
City: Taguig
Contract type: Full time

At Globe, our goal is to create a wonderful world for our people, business, and nation. By uniting people of passion who believe they can make a difference, we are confident that we can achieve this goal.
Job Description
We are seeking an AI Data Engineer with specialized expertise in Databricks and comprehensive experience implementing bronze, silver, and gold data pipelines. You will leverage your skills in Snowflake, Google BigQuery (GBQ), Python, and Google Cloud Platform (GCP) to build, optimize, and maintain robust data solutions supporting AI and analytics workloads. In this critical role, you'll collaborate closely with data scientists, AI specialists, cloud engineers, and business stakeholders to deliver scalable, secure, and performant data platforms.
Duties And Responsibilities
Data Pipeline Development & Optimization
- Design, build, and maintain data pipelines using Databricks, implementing bronze, silver, and gold data layers (see the pipeline sketch after this list).
- Continuously optimize data ingestion, transformation, and loading processes to improve performance, reliability, and scalability.
- Ensure high data quality standards through robust validation, auditing, and governance frameworks.
- Manage and optimize data solutions on Snowflake and Google BigQuery, ensuring efficient querying and resource utilization.
- Develop strategies to migrate, integrate, and synchronize data between various cloud data warehouses.
- Implement best practices for cloud data management, ensuring cost-effective and secure operations.
- Leverage Python to automate data processes, streamline workflows, and develop efficient data transformations.
- Build and manage automation scripts and workflows for data extraction, cleaning, and loading into various platforms.
- Collaborate with AI teams to develop data integration points and data access layers supporting machine learning workloads.
- Utilize GCP services (Cloud Storage, BigQuery, Dataflow, Pub/Sub, Composer) to architect and deploy scalable data systems.
- Integrate GCP infrastructure seamlessly with Databricks and other analytics environments.
- Ensure robust cloud infrastructure monitoring, logging, and alerting mechanisms to proactively identify and mitigate data pipeline issues.
- Establish comprehensive data governance practices, ensuring compliance with regulatory standards (e.g., GDPR, HIPAA).
- Implement robust data security practices including encryption, role-based access control, and auditing mechanisms.
- Collaborate closely with security and compliance teams to maintain secure data operations.
- Partner with data scientists, engineers, analysts, and business stakeholders to understand and fulfill data infrastructure requirements.
- Clearly communicate complex data engineering concepts and solutions to technical and non-technical audiences.
- Actively participate in agile methodologies, contributing to sprint planning, retrospectives, and continuous improvement initiatives.
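To make the bronze, silver, and gold layering referenced above more concrete, the following is a minimal PySpark sketch of what such a pipeline can look like on Databricks with Delta tables. It is illustrative only: the paths, table names, and columns (usage_events, subscriber_id, data_mb, and so on) are hypothetical and are not part of this posting.

```python
# Minimal bronze -> silver -> gold sketch on Databricks (Delta Lake).
# All paths, schemas, and column names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks notebooks

# Bronze: land raw source data as-is to preserve fidelity.
raw = spark.read.format("json").load("/mnt/landing/usage_events/")
raw.write.format("delta").mode("append").saveAsTable("bronze.usage_events")

# Silver: deduplicate, enforce types, and drop unusable records.
silver = (
    spark.table("bronze.usage_events")
    .dropDuplicates(["event_id"])
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .filter(F.col("subscriber_id").isNotNull())
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver.usage_events")

# Gold: aggregate into an analytics- and AI-ready summary table.
gold = (
    spark.table("silver.usage_events")
    .groupBy("subscriber_id", F.to_date("event_ts").alias("event_date"))
    .agg(
        F.sum("data_mb").alias("total_data_mb"),
        F.count("*").alias("event_count"),
    )
)
gold.write.format("delta").mode("overwrite").saveAsTable("gold.daily_usage_summary")
```

In practice each layer would typically run as a scheduled Databricks job or Delta Live Tables pipeline, with validation checks between layers, which is where the data quality and governance responsibilities above come into play.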
Education, Experience, and Skills
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or related fields (or equivalent experience).
- 3-5+ years of experience as a data engineer, with specific expertise in Databricks and structured data pipeline architectures (bronze, silver, gold).
- Demonstrable experience managing data warehouses and data lakes with Snowflake and GBQ.
- Proficient in Databricks platform management, data pipeline construction, and optimization.
- Strong expertise in Snowflake and GBQ, including data modeling, query optimization, and performance tuning.
- Advanced proficiency in Python, particularly for data manipulation, ETL processes, and automation.
- Extensive experience with GCP data services including Cloud Storage, BigQuery, Pub/Sub, and Dataflow (illustrated in the sketch after this list).
- Understanding of AI and analytics data requirements, including data preparation and feature engineering.
- Experience building data solutions supporting machine learning model training, validation, and deployment.
- Evidence of successful data engineering projects involving Databricks and cloud data warehouses.
- Examples demonstrating complex pipeline management and data architecture contributions.
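As a small illustration of the Python, Cloud Storage, and BigQuery expectations above, here is a hedged example of loading Parquet files from Cloud Storage into BigQuery with the official google-cloud-bigquery client. The project, bucket, dataset, and table names are placeholders invented for this sketch.

```python
# Load Parquet files from Cloud Storage into a BigQuery table.
# Project, bucket, dataset, and table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,  # replace existing contents
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/gold/daily_usage_summary/*.parquet",
    "example-project.analytics.daily_usage_summary",
    job_config=job_config,
)
load_job.result()  # wait for the load job to finish

table = client.get_table("example-project.analytics.daily_usage_summary")
print(f"Loaded {table.num_rows} rows.")
```

A flow like this can be orchestrated with Cloud Composer (Airflow) or triggered from Pub/Sub events, which maps to the GCP services named in the responsibilities above.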
KPIs
- Timely completion of bronze, silver, and gold data pipeline implementations.
- Data pipeline availability and reliability (measured in uptime percentage and successful job completions).
- Efficiency and performance improvement in data ingestion and query execution times.
- Reduction in data-related incidents, including quality and security breaches.
- Improved compliance levels with data governance and security standards (GDPR, HIPAA).
Equal Opportunity Employer
Globe’s hiring process promotes equal opportunity for applicants. No form of discrimination is tolerated throughout the entire employee lifecycle, including the hiring process, such as in posting vacancies, selecting, and interviewing applicants.
Globe’s Diversity, Equity and Inclusion Policy Commitment can be accessed here
Make Your Passion Part of Your Profession. Attracting the best and brightest Talents is pivotal to our success. If you are ready to share our purpose of Creating a Globe of Good, explore opportunities with us.