Senior Engineer (Data Platform)

Posted: 5 Mar 2021 | Closes: 13 Jun 2021
Location: Sofia | Salary: Competitive
Permanent & Full-time | Ref: JR006239

Position Summary 

An exciting opportunity for a senior engineer to join the Product & Technology department of a growing business and a globally recognized brand. Working in teams based in Sofia, or in distributed teams across London and Sofia, you will deliver innovative technical solutions as you build and operate world-class platforms that grow the Financial Times' strategic business models. You will have the autonomy to select the tools and technologies you need to build and operate services responsible for brand-critical capabilities. Someone who is comfortable with an ever-changing technical landscape and keen to contribute to the company's processes and broader know-how would thrive in this role. To understand the Financial Times Data Platform team's goals and projects in more detail, take a look at this article.

Main Duties and Responsibilities
  • Work directly with product owners and senior stakeholders to fully shape solutions from inception to deployment and beyond

  • Work within, with an opportunity to lead, an agile delivery team of engineers

  • Develop high-quality data solutions within the Financial Times' Data Platform

  • Design and implement low maintenance, well-monitored, secure and scalable solutions to customer problems

  • Design, build and operate solutions, from cradle to grave, that meet both functional and non-functional KPIs

  • Understand and play an active part in designing the architecture, tooling and release cycle processes used by the engineering teams across Product & Technology

  • Contribute to company-wide processes, frameworks, and guidelines

  • Develop an in-depth understanding of the FT's underlying data, data flows, and data structures

  • Develop a close relationship with our customers and provide operational support

Person Specification (Candidate Profile)

Essential

  • Good command of written and spoken English

  • Extensive hands-on experience in a Data Engineering/Data Warehouse Development role, covering data warehousing, data technologies, and business intelligence on a large-scale DW/BI project

  • Experience with modern database technologies (AWS/cloud-based, in-memory, etc.), scripting languages, and big data technologies

  • Experience working with data sources of varying volume, variety, and velocity

  • Experience in designing and developing ETL solutions using both out-of-the-box ETL vendor technologies and SQL

  • Highly proficient in at least one of the programming languages relevant to data at the FT: Python, SQL, or Java

  • A track record of delivering well-engineered solutions using current technologies and best practices such as SOLID, TDD, CI/CD, and pair programming

  • An active member of the broader technology community with an understanding of current leading trends

  • Experience working as part of an Agile delivery team, using methodologies like Scrum and Kanban

  • Good understanding of the principles and trade-offs of a microservices architecture

  • Good working experience with at least one cloud platform, ideally AWS

  • Comfortable working in a Linux environment

Desirable

  • Experience as the technical lead of a multi-disciplinary group of engineers in an agile delivery team

  • Knowledge of optimization techniques like indexing/performance tuning on both relational and columnar databases

  • Experience working within an environment where operational support and monitoring of code and systems is part of the culture (DevOps)

  • Experience in additional data warehouse and business intelligence tools and technologies

  • Experience designing and developing RESTful APIs

  • Ability to accurately monitor and analyze system performance using tools like Grafana

  • An industry certification in a key platform, tool, or domain used within Engineering at the FT; any relevant certification qualifies, but we currently recommend an AWS Associate certification

  • Experience productionizing machine learning algorithms or data science models

  • Experience with streaming applications such as Kafka Streams or Spark Streaming

  • Experience with ETL frameworks (job orchestration tools) such as Airflow or Luigi

Personal Attributes

  • Inquisitive and innovative, with a lateral-thinking approach

  • Thorough, with strong attention to detail

  • A proactive and determined approach to problem-solving

  • A passion and aptitude for working with business data

  • Capable of prioritizing multiple streams of work, in varying states of completion

  • A passion for the web, digital and emerging technologies

  • A natural advocate for data and BI

  • A natural collaborator and team player 
