Data Analyst — Data Validation Specialist
ERNI

Founded in 1994 and headquartered in Switzerland, ERNI is a leading software development company with over 800 employees worldwide. Specializing in IT and software engineering, we drive innovation in both process and technology. Our first Asia Pacific service center, located in Metro Manila (Mandaluyong), supports clients across Europe, APAC, the Philippines, and the USA. As we continue to grow, we are looking for passionate and motivated individuals to join our team.
Why ERNI is the Perfect Place for You:
• International Exposure: Work with global clients on cutting-edge projects.
• Inclusive Culture: Thrive in a collaborative and diverse work environment.
• Career Development: Enjoy continuous learning and professional growth opportunities.
Perks and Benefits:
• Career Stability: Enjoy a stable career path with ample project opportunities.
• Skill Enhancement: Access free training and certifications.
• Baby Basket: A gift to welcome your newborn to the ERNI family.
• Fruit Basket: A boost of vitamins during hospitalization.
• Office Perks: Enjoy free snacks and coffee.
Growth and Opportunities:
• Free Training: Advance your skills through technical and non-technical training.
• Challenging Projects: Engage in complex software projects across MedTech, Industry, Finance, and Transportation.
• Supportive Environment: Benefit from a team dedicated to guiding and supporting your success.
• Recognition and Advancement: Receive acknowledgment for your efforts and opportunities for promotion.
• Open Communication: Experience a transparent culture where your input is valued.
Flexibility:
• Hybrid Work Setup: Balance remote and in-person work for better work-life integration.
Events:
• Connect and Celebrate: Participate in a variety of events including leisure, summer, family, social, and year-end gatherings.
What we are looking for:
Qualifications:
• Experience: 3–5 years of data-quality / validation work, including at least one Azure lakehouse or migration project
• Platforms & Tools: Azure Databricks (PySpark, Delta Lake), Azure Data Factory, Great Expectations / dbt tests
• Coding & Querying: Advanced SQL; Python for data testing and automation
• Analytics & Visualization: Power BI (or Tableau) to publish data-quality dashboards
• Methodology: Deep grasp of data-quality dimensions and root-cause analysis; CI/CD for data tests
• Soft Skills: Meticulous, proactive communicator with strong stakeholder management
• Nice-to-Have: Azure Synapse or Fabric, Kafka/Event Hub ingestion, data catalog tools (Purview, Collibra), regulatory/audit exposure, Agile/Scrum
How can you contribute to the team?
Core Duties
- Validation Framework Design – Define data-quality rules, test cases, and acceptance criteria for migrated and incremental loads in Databricks.
- Automated Testing – Build PySpark/SQL tests and Great Expectations (or dbt tests) notebooks; schedule them with ADF pipelines and alerts (a minimal illustrative sketch follows this list).
- Issue Triage & Resolution – Diagnose defects, work with data engineers to remediate and re-run pipelines, and verify fixes in lower & prod tiers.
- Monitoring & Alerting – Create quality dashboards and SLA monitors in Power BI and Databricks SQL, surfacing metrics like freshness, nulls, and schema drift.
- Standards & Documentation – Maintain validation playbooks, data dictionaries, and quality KPIs; evangelize best practices across migration squads.
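For illustration only, here is a minimal sketch of the kind of automated check this role builds, written in plain PySpark; in practice such rules would typically be expressed as Great Expectations or dbt tests and scheduled with alerts from an ADF pipeline. The table name, column names, and thresholds below are hypothetical, not part of any actual ERNI project.

```python
# A minimal sketch, assuming a hypothetical Delta table with a primary key,
# a country code column, and a DATE-typed load_date column.
from datetime import date, timedelta

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.read.table("lakehouse.silver.customers")  # hypothetical table name

# Most recent load date, used for the freshness check (assumes a DATE column).
latest_load = df.agg(F.max("load_date").alias("latest")).first()["latest"]

checks = {
    # Completeness: the key column must never be null.
    "customer_id_not_null": df.filter(F.col("customer_id").isNull()).count() == 0,
    # Uniqueness: the primary key must be unique across the table.
    "customer_id_unique": df.count() == df.select("customer_id").distinct().count(),
    # Validity: country codes restricted to an expected set.
    "country_code_valid": df.filter(~F.col("country_code").isin("CH", "DE", "PH", "US")).count() == 0,
    # Freshness: the most recent load must be no older than one day.
    "load_is_fresh": latest_load is not None and latest_load >= date.today() - timedelta(days=1),
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    # Raising here fails the notebook run so the scheduling pipeline can alert on it.
    raise ValueError(f"Data-quality checks failed: {failed}")
print("All data-quality checks passed")
```

In a real setup these results would also be written to a metrics table so the Power BI / Databricks SQL dashboards described above can track freshness, null rates, and schema drift over time.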