Job description
Role Title: Lead Data Engineer
BU/Segment: Analytics
Location: United Kingdom (1-2 days per week onsite at the client's Weybridge office)
Employment Type: Permanent
Summary of the role:
The Lead Data Engineer will provide technical expertise in the analysis, design, development, rollout and maintenance of data integration initiatives. This role will contribute to implementation methodologies and best practices, as well as work on project teams to analyse, design, develop and deploy business intelligence / data integration solutions to support a variety of customer needs. This position oversees a team of Data Integration Consultants at various levels, ensuring their success on projects, goals, training and initiatives through mentoring and coaching.
Provides technical expertise in needs identification, data modelling, data movement and transformation mapping (source to target), automation and testing strategies, translating business needs into technical solutions with adherence to established data guidelines and approaches from a business unit or project perspective, whilst leveraging best-fit technologies (e.g., cloud, Hadoop, NoSQL) and approaches to address business and environmental challenges.
Works with stakeholders to identify and define self-service analytic solutions, dashboards, actionable enterprise business intelligence reports and business intelligence best practices. Responsible for repeatable, lean and maintainable enterprise BI design across organisations. Effectively partners with the client team.
We value leadership not only in the conventional sense: within a team, we expect people to be leaders. The candidate should exhibit leadership qualities such as innovation, critical thinking, optimism/positivity, communication, time management, collaboration, problem-solving, acting independently, knowledge sharing and approachability.
As part of your duties, you will be responsible for:
- Design, develop, test, and deploy data integration processes (batch or real-time) using tools such as Microsoft SSIS, Azure Data Factory, Databricks, Matillion, Airflow, Sqoop, etc.
- Work on the migration of SAS data to the Azure cloud.
- Create functional & technical documentation – e.g. ETL architecture documentation, unit testing plans and results, data integration specifications, data testing plans, etc.
- Provide a consultative approach with business users, asking questions to understand the business need and deriving the data flow, conceptual, logical, and physical data models based on those needs. Perform data analysis to validate data models and to confirm ability to meet business needs.
- May serve as project or DI lead, overseeing multiple consultants from various competencies
- Stay current with emerging and changing technologies to best recommend and implement beneficial technologies and approaches for Data Integration
- Ensure proper execution/creation of methodology, training, templates, resource plans and engagement review processes
- Coach team members to ensure understanding on projects and tasks, providing effective feedback (critical and positive) and promoting growth opportunities when appropriate.
- Coordinate and consult with the project manager, client business staff, client technical staff and project developers in data architecture best practices and anything else that is data related at the project or business unit levels
- Architect, design, develop and set direction for enterprise self-service analytic solutions, business intelligence reports, visualisations and best practice standards. Toolsets include, but are not limited to, SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau and Qlik.
- Work with the reporting team to identify, design and implement a reporting user experience that is consistent and intuitive across environments and report methods, defines security, and meets usability and scalability best practices.
What we are looking for:
- 10 years' industry implementation experience with data integration tools such as Microsoft SSIS, SAS, Azure Data Factory, Databricks, AWS Glue, Step Functions, Airflow, Apache Flume/Sqoop/Pig, etc.
- 3-5 years of management experience required
- 3-5 years consulting experience preferred
- Minimum of 5 years of data architecture, data modelling or similar experience
- Must have prior hands-on experience on a SAS data migration project
- Bachelor’s degree or equivalent experience; Master’s degree preferred
- Strong background in data warehousing, OLTP systems, data integration and the SDLC
- Strong experience in big data frameworks and working experience in Spark, Hadoop, Hive or similar (incl. derivatives such as PySpark (preferred), Spark with Scala or Spark SQL), along with experience in libraries/frameworks that accelerate code development
- Experience using major data modelling tools (examples: ERwin, ER/Studio, PowerDesigner, etc.)
- Experience with major database platforms (e.g. SQL Server, Oracle, Azure Data Lake, Hadoop, Azure Synapse/SQL Data Warehouse, Snowflake, Redshift etc.)
- Strong experience in orchestration and working experience in Data Factory, HDInsight, Data Pipeline, Cloud Composer or similar
- Understanding and experience with major Data Architecture philosophies (Dimensional, ODS, Data Vault, etc.)
- Understanding of modern data warehouse capabilities and technologies such as real-time, cloud, Big Data.
- Understanding of on premises and cloud infrastructure architectures (e.g. Azure, AWS, GCP)
- Strong experience with Agile processes (Scrum cadences, roles, deliverables) and working experience with Azure DevOps, JIRA or similar, with experience in CI/CD using one or more code management platforms
- 3-5 years’ development experience in decision support / business intelligence environments utilising tools such as SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau, Looker, etc.
Skills and Personal attributes we would like to have:
- Knowledge and working experience with Data Integration processes, such as Data Warehousing, EAI, etc.
- Experience in providing estimates for the Data Integration projects including testing, documentation, and implementation
- Ability to analyse business requirements as they relate to data movement and transformation processes, and to research, evaluate and recommend alternative solutions.
- Ability to provide technical direction to other team members including contractors and employees.
- Ability to contribute to conceptual data modelling sessions that accurately define business processes independently of data structures, and then to combine the two.
- Proven experience leading team members, directly or indirectly, in completing high-quality major deliverables with superior results
- Demonstrated ability to serve as a trusted advisor that builds influence with client management beyond simply EDM.
- Can create documentation and presentations such that they “stand on their own”
- Can advise sales on evaluation of Data Integration efforts for new or existing client work.
- Can contribute to internal/external Data Integration proof of concepts.
- Demonstrates ability to create new and innovative solutions to problems that have previously not been encountered.
- Ability to work independently on projects as well as collaborate effectively across teams
- Must excel in a fast-paced, agile environment where critical thinking and strong problem solving skills are required for success
- Strong team building, interpersonal, analytical, problem identification and resolution skills
- Experience working with multi-level business communities
- Can effectively utilise SQL and/or available BI tools to validate and elaborate business rules.
- Demonstrates an understanding of EDM architectures and applies this knowledge in collaborating with the team to design effective solutions to business problems/issues.
- Effectively influences and, at times, oversees business and data analysis activities to ensure sufficient understanding and quality of data.
- Demonstrates a complete understanding of and utilizes DSC methodology documents to efficiently complete assigned roles and associated tasks.
- Deals effectively with all team members and builds strong working relationships/rapport with them.
- Understands and leverages a multi-layer semantic model to ensure scalability, durability, and supportability of the analytic solution.
- Understands modern data warehouse concepts (real-time, cloud, Big Data) and how to enable such capabilities from a reporting and analytic stand-point.
What we offer:
- A competitive salary with a generous bonus, private healthcare, critical illness life assurance at 4x your annual salary, income protection insurance, and a rewarding pension.
- EXL provides everyday financial well-being solutions, such as cash-back cards, through which you can earn cashback while enjoying discounts, promotions and offers from top retailers. We also offer a Cycle Scheme where you can save money on bikes and cycling accessories.
- At EXL, we are committed to providing our employees with the tools and resources they need to succeed and excel in their careers. We offer a wide range of professional and personal development opportunities. We also support a range of learning initiatives that allow our employees to build on their existing skills and knowledge. From online courses to seminars and workshops, our employees have the opportunity to enhance their skills and stay up to date with the latest trends and technologies.
- As an Equal Opportunity Employer, EXL is committed to diversity. Our company does not discriminate based on race, religion, colour, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, age, or disability status.
- EXL employees are eligible to purchase stock as part of our Employee Stock Purchase Plan (ESPP).
- At EXL, we offer a flexible hybrid working model that allows employees to live a balanced, healthy lifestyle while strengthening our culture of collaboration.
About EXL Service
CEO: Rohit Kapoor
Revenue: $2 to $5 billion (USD)
Size: 10000+ Employees
Type: Company - Public
Website: www.exlservice.com
Year Founded: 1999