- Fintech Client
- 12-Month Contract
- Location: Hybrid – 3 days per week on-site in South Dublin
- Daily rate: €550–€575
The client is looking for a Senior Data Engineer to help lead the technical design and build of a new Analytic Foundation, a suite of API-based services (e.g., prediction, matching, forecasting) underpinned by a robust data platform. These tools will deliver actionable insights from a centralised data environment and will be core to multiple strategic products.
As a key engineering contributor, you’ll collaborate closely with product managers, designers, and peer engineers to build scalable, performant data products that power decision-making for major financial institutions.
What You’ll Do:
- Lead the design and development of complex features in a full-stack, agile environment
- Build analytics and data models to support scalable applications and services
- Develop intuitive UIs for customer-facing tools and dashboards
- Write and review clean, performant, well-tested code
- Mentor junior engineers and support team growth
- Collaborate across teams to deliver cohesive product experiences
- Drive process improvements and best practices in data engineering
Key Technologies:
- Languages/Frameworks: Python, Scala, Java, Spring Boot, .NET/C#, React, Redux, TypeScript
- Big Data & Tools: Spark, Hadoop, Hive, Impala, Airflow, NiFi, Sqoop
- Databases: SQL Server, Databricks SQL
- Other: Machine Learning, Predictive Analytics, API Development
What We're Looking For:
- 5+ years of experience in full-stack or data engineering in a production environment
- Strong hands-on expertise with the Hadoop platform, Python (PySpark), and Databricks SQL
- Proven experience leading large, cross-functional technical initiatives
- Ability to work fluidly across business, technical, and data teams
- Experience with end-to-end ML pipelines and big data workflows
- Strong mentoring and leadership skills
- Passion for problem solving and building impactful data products
- Degree in Computer Science or a related technical field
Top 3 Must-Haves:
- Experience with Hadoop ecosystem
- Python with PySpark
- Databricks SQL