Recology
Senior Data Engineer
Under minimal direction, develops and maintains the data lake repository, data warehouse, and data mapping infrastructure, with the ultimate goal of making data accessible so the organization can use it to evaluate and optimize its performance. Demonstrates expertise in designing, developing, and maintaining scalable data pipelines and ETL processes to support enterprise data needs.
Essential Responsibilities
- Creates and maintains optimal cloud data platforms, pipeline architecture and storage systems.
- Ensures data environments are secure, scalable, and well‑governed.
- Develops and supports data ingestion, transformation, and orchestration workflows.
- Assembles large, complex data sets (structured, unstructured, and text-based) that meet functional / non-functional business requirements.
- Applies analytics techniques to detect anomalies and patterns and to surface insights.
- Performs unit testing, mock data generation and data validation.
- Applies design best practices for analytics data assets, including data modeling, data schemas, data integrity, data quality, security, and performance.
- Participates in agile software planning and development activities, including daily standups, user story and task organization, backlog grooming, and effort estimation.
- Analyzes business and technical requirements to develop documentation, designs, code, and tests.
- Maintains source control hygiene (branch protection, git-flow) and CI/CD practices, including build, test, and automated deployment with tools such as Jenkins, Travis CI, or Azure DevOps.
- Develops AI/ML tools, vector databases, and LLM-related components.
- Other duties as assigned.
Qualifications
- 8+ years of experience in a data engineer role.
- High school diploma or GED required.
- Bachelor’s degree preferred.
- Azure Fabric experience and Azure cloud expertise preferred.
- Significant cloud engineering skills, with experience working with SQL and NoSQL database technologies.
- Advanced working SQL knowledge and experience with relational databases and query authoring, as well as working familiarity with a variety of database technologies.
- Experience reviewing query execution plans and performance tuning.
- Experience working with common languages and frameworks such as Python and PySpark.
- Full-stack development experience, with an in-depth understanding of modern application design principles, DevOps, and microservices.
- Relevant certifications from cloud providers.
- Strong written and verbal communication skills, including ability to convey proposed solutions through natural language, diagrams, and design patterns.
- Functions independently as a senior technical specialist with leadership and mentoring capability; willing to jump in, learn new technologies, and work effectively across the full stack.
To apply for this job, please visit phg.tbe.taleo.net.