About the Role

Brief Description        

In this role, you will create reliable data architecture for building highly scalable data pipelines that collect large amounts of data from different sources and transform the data into a usable format for analysis and reporting (Grade H09).

Detailed Description        

Principal Accountabilities (Key Performance Areas):

Accountability (Responsibility) 

  • Work with IT Business Analysts, data owners, and other business users to understand the business data requirements needed to develop advanced analytics solutions. 
  • Create and maintain real-time data pipelines, covering data sourcing, acquisition, extraction, transformation, profiling, storage, updating, partitioning, indexing, and maintenance. 
  • Develop and automate data quality and monitoring reports as part of a data governance strategy. 
  • Process both raw structured and unstructured data at scale into a form suitable for analysis and reporting. 
  • Work with Technology Operations team to resolve data issues through extensive performance tuning and optimizing the application of data across all advanced analytics layers (acquisition, staging, profiling, cleansing, analysis, modelling, output). 
  • Implement data management projects by working with key stakeholders (Technology and Business), and provide expert input, guidance, and feedback on such projects. 
  • Develop and maintain technical documentation and manuals on configurations, setups, and deployment of various advanced analytics solutions. 
  • Perform peer reviews of developed data pipelines for quality checks and ensure quality output.

Job Requirements        

Qualification (Minimum) 

  • Bachelor’s degree in Computer Science, Computer Technology, Information Technology, Computer Engineering, or a related field 

Additional Qualifications 

  • Training and certification in Oracle Data Integrator, Informatica, or another data integration tool 
  • Certifications in Linux and Python scripting 

Years of Experience (Minimum) 

  • Proficiency in SQL (Oracle PL/SQL), Python, data modelling, and data warehousing and wrangling tools, languages, and techniques 
  • Proficiency in Big Data technologies (MapReduce, Hadoop, Kafka, NiFi, Pig, Hive, Spark) and Elasticsearch 
  • Minimum of 3 years of proven performance in Business Intelligence & Analytics as an ETL Developer or Data Engineer, working with tools such as Oracle Data Integrator and Oracle SOA 
  • Experience with relational data modeling and databases, data management and data processing 
  • Broad understanding of the latest trends in Data Analytics 
  • Be an authority in collecting, organizing, analyzing, and storing large datasets from various sources (relational databases, text and PDF files, websites, etc.) with attention to detail and accuracy. 
  • Have an inquisitive nature with an aptitude to diagnose and tackle analytically complex business problems.        

Other Skills 

  • Good understanding of Software Development Life Cycle process 
  • Experience with integrated data logistics platforms (NiFi, Kafka) 

Behavioral Competencies 
  • Ability to excel in a fast-paced, team-oriented environment. 
  • Good troubleshooting skills and ability to work independently/ proactively are a must. 
  • Excellent verbal and written communication; effective liaison skills and the ability to work with a wide range of professionals in various disciplines. 
  • Committed to delivering the highest level of customer satisfaction. 
  • Motivated and energetic individual who takes initiative, enjoys finding solutions to a variety of challenges, is detail-oriented, and takes great pride in their work.

How to Apply

https://i-pride.kenya-airways.com/OA_HTML/OA.jsp?page=/oracle/apps/irc/candidateSelfService/webui/VisVacDispPG&OAHP=IRC_EXT_SITE_VISITOR_APPL&OASF=IRC_VIS_VAC_DISPLAY&akRegionApplicationId=821&transactionid=987268789&retainAM=N&addBreadCrumb=RP&p_svid=952&p_spid=46008&oapc=14&oas=vB2dITZ9vyYEjYq_Otu_Tw..&utm_source=MyJobMag