Tyndale USA

Data Engineer

Job Locations: US-PA-Pipersville
Job ID: 2025-2286
# of Openings: 1
Posted Date: 2/14/2025 8:49 PM
Category: Information Technology

Overview

The Tyndale Company, an 8x Top Workplace winner in PA, 4x Top Workplace winner in TX, and a certified woman-owned business, is seeking a skilled Data Engineer to join our growing AI organization. This role plays a critical part in our mission to harness data for advanced AI and Machine Learning applications. You will collaborate with data scientists and business stakeholders to build and maintain robust data pipelines, ensuring our data architecture supports the full lifecycle of AI initiatives. This is an exciting opportunity to shape the data foundation of our AI and Machine Learning efforts, driving impactful insights and innovations.

 

HYBRID/REMOTE: Tyndale supports a strong work-life balance. This opportunity requires onsite work a minimum of 1 day per week, with the remaining 4 days per week worked remotely. To be considered, candidates must reside within a commutable distance of our corporate office in Pipersville, PA (Bucks County).

 

Please note: we are not partnering with outside agencies. 

Responsibilities

  • Data Pipeline Development: Design, build, and optimize data pipelines to support AI and ML projects. Enable real-time and batch data processing to meet diverse business needs.
  • Data Integration and Management: Integrate data from various sources, including our ERP, SaaS platforms, and external APIs, to provide a unified data view for AI applications.
  • Data Quality Assurance: Implement processes to ensure data quality, consistency, and accuracy across systems, working closely with data governance standards.
  • Collaboration with Data Science and AI Teams: Partner with data scientists to understand data requirements for model development, deployment, and monitoring.
  • Performance Optimization: Monitor and improve data pipeline performance, optimizing ETL processes, storage, and query performance as needed.
  • Documentation and Best Practices: Develop and maintain documentation for data workflows and adhere to best practices for data engineering, including version control and testing.

Qualifications

  • Bachelor’s degree in Computer Science, Statistics, Mathematics, Data Science, or a related field. 
  • Minimum of 3 years of experience in data engineering, with proven experience in data pipeline development, data integration, and ETL processes.
  • Proficiency in SQL, Python, and data warehousing solutions is required, along with experience with AWS services.
  • Understanding of AI and ML concepts and familiarity with machine learning model lifecycle requirements.
  • Strong understanding of data governance, data architecture, and best practices for data management in a modern AI ecosystem.
  • Strong communication skills, capable of working cross-functionally with technical and non-technical stakeholders.
  • Experience in B2B ecommerce or the apparel industry preferred.
  • Experience with data visualization tools and MLOps is a plus.

 

Benefits:

  • Health & Wellness: Comprehensive medical, dental, and vision insurance with competitive premiums. Paid parental leave. Mental health support through an EAP with partial reimbursement of copays, fertility support, and robust wellness programs with annual reimbursements.
  • Work-Life Balance: Many positions with Tyndale offer hybrid onsite + remote work schedules, generous PTO, paid holidays + a floating holiday, and more.
  • Financial Compensation: Competitive salary, 401(k) with matching, and bonus opportunities.
  • Career Growth & Development: Training, certification, and tuition reimbursement programs, and demonstrated paths for knowledge sharing and internal promotion.
  • Culture & Perks: Family-owned values, award-winning culture, team-engagement events, casual dress code, company-sponsored charitable events and activities, and an inclusive workplace that values collaboration and integrity.
