Job description
- Experienced data engineer skilled in designing scalable data architectures, analyzing complex data sets, and developing robust ETL pipelines.
- Proactive problem solver with a strong understanding of data modelling and database management, capable of ensuring data quality and integrity across multiple systems and platforms.
- Design, develop, implement, and run cross-domain, modular, optimized, flexible, scalable, secure, reliable, and quality data solutions (Data Pipeline, Data model, BI Reports and Dashboards) that transform data for meaningful analyses and analytics while ensuring operability.
- Work closely with subject matter experts, business analysts, data architects, other developers, and stakeholders to gather requirements and design data integration solutions.
- Participate in vendor/strategic partner evaluations and monitor the relationship on an ongoing basis.
- Develop and maintain Python scripts for data manipulation and processing.
- Embrace continuous learning of engineering practices to ensure industry best practices and technology adoption, including Python, Cloud (AWS), Snowflake, DevOps and Agile thinking.
- Participate in code reviews and ensure adherence to coding standards and best practices.
- Troubleshoot and resolve issues with data integration processes.
- Create and maintain high-quality technical documentation of data definitions, transformations, and processes to ensure data governance and security.
- Bachelor’s degree; Computer Science, MIS, or Mathematics are preferred majors.
- Experience working in IT, preferably within a financial services company, with a minimum of 3 years of IT experience primarily in data.
- At least 3 years of experience with Python Development and data manipulation libraries (e.g., Pandas, NumPy, SciPy)
- Experience working in virtualized cloud environments including cloud-based IaaS/SaaS/PaaS solutions. (AWS Preferred).
- 1+ years of experience with CI/CD tools such as Jenkins, GitHub, etc.
- Experience with web service technologies (REST, SOAP, etc.) for application integration.
- Knowledge of data cleaning, wrangling, visualization, and reporting.
- Knowledge of strategies for processing large amounts of structured and unstructured data, including integrating data from multiple sources.
- Exposure to databases, BI applications, data quality and performance tuning.
- Expertise in high-volume streaming data platforms.
- Understanding of ETL methodologies and data warehousing principles, approaches, technologies, and architectures, including the concepts, designs, and usage of data warehouses and data marts.
- Experience in writing complex SQL queries and stored procedures.
- Knowledge of data warehousing, OLAP, and multi-dimensional, star, and snowflake schemas.
- Knowledge and experience with database design principles including referential integrity, normalization, and indexing to support application development.
- A culture of innovation, empowerment, decision-making, and accountability
- Comprehensive health and welfare benefits that serve the needs of you and your family and foster a culture of wellness
- Additional benefits and amenities, including paid time-off programs (vacation, sick leave, and holidays)
- Hybrid work environment for most positions
About Element Fleet Management
CEO: Laura Dottori-Attanasio
Revenue: $2 to $5 billion (USD)
Size: 1001 to 5000 Employees
Type: Company - Public
Website: https://www.elementfleet.com/
Year Founded: 1946