Big Data Support Engineer
Dar es Salaam, Tanzania
Responsible for all development activities required to deliver Big Data solutions that meet business needs.
Responsibilities
* As part of the Big Data delivery team, help implement the new data lake platform; define and implement a data integration and workflow framework that provides for data loading, integration, quality management, aggregation, joins, etc., and distribution to various systems using advanced Big Data technologies.
* Design and develop data processing workflows that adhere to our platform guidelines and standards in order to satisfy the functional and non-functional requirements for extract, load and transformation routines.
* In collaboration with the Solution Designer and Platform Manager, ensure solutions are scalable and high-performance.
* Provide application support for IT-related problems in the area of data processing workflows.
* Develop and maintain scalable data management and analytics solutions using Hive, HBase, Informatica, etc.
* Participate in the research, design and development of scaling strategies for our very large and growing data assets.
* Fill a hands-on technical role; contribute to all phases of the software development lifecycle, including analysis, architecture, design, implementation and QA.
* Constantly learn and apply the newest features to keep solutions up to date.
* Write technical documentation to ensure maintainability of developed applications.
* Support the analysis and design process from a technical and feasibility point of view.
* Ensure peer reviews and unit tests are prepared and conducted across the entire team.
* Provide the effort estimates required to develop software.
* Transition Structured Query Language (SQL) and file-system application data stores to a scalable Big Data framework where appropriate.
* Ensure compliance with statutory and company procedures, including security policies.
* Maintain an understanding of relevant industry trends and current knowledge of the Big Data technology deployed in the business area.
* Follow all change control procedures and put in place full version control and configuration management to ensure that the production service is not compromised.
Must Have
* A very strong SQL/data analysis or data mining background, with experience in Business Intelligence, Data Warehousing, common/reusable components, audit controls and ETL frameworks.
* Solid understanding of large scale data management environments (relational and/or NoSQL).
* 3+ years of experience building massively scalable distributed data processing solutions with Hadoop, HBase and Hive.
* Demonstrable knowledge of Continuous Integration, Test-Driven Development and code analysis, including their application within the software development lifecycle.
* 2+ years of experience developing ETL routines, preferably with Informatica Big Data Edition.
* Experience in the development and configuration of REST APIs.
* Experience with agile IT development methodologies (e.g. Scrum).
* Prior experience working in a financial services environment is desirable, preferably insurance.
* Ability to work effectively in a global organization, under time pressure and to manage ambiguity.
* Bachelor's degree in IT or equivalent.
* Good English skills (written and oral).
* Strong knowledge of Informatica PowerCenter toolset.