Data Engineer

The Data Engineer will use various methods to extract raw data and transform it into data lakes and data warehouses, and will strive for efficiency by aligning data systems with business goals. To succeed in this role, he/she should have strong analytical skills and the ability to extract, load, and transform data from various sources into data lakes and data warehouses. Required skills also include familiarity with several programming languages (preferably Python or R), SQL, building ETL or ELT pipelines, and managing data lakes, data warehouses, and data marts.

Duties/Responsibilities:

Extract data from multiple sources (cloud and on-premises) and ingest it into a data lake (Amazon S3) using AWS services, APIs, or connection protocols such as ODBC and JDBC.

Clean, cleanse, and transform data.

Build and maintain data lakes, data warehouses, and data marts on AWS as per the business requirements.

Build and maintain the data catalog in AWS Glue.

Build data pipelines and workflows in AWS Glue that ingest raw data into the data lake and load cleaned, transformed data into the data warehouse (a brief sketch of such a job follows this list).

Conduct complex data analysis and report on results.

Explore ways to enhance data quality and reliability.

Evaluate business needs and objectives.

Interpret trends and patterns.
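
For illustration only, here is a minimal sketch of the kind of AWS Glue (PySpark) job these duties describe: read a source table that a crawler has already registered in the Glue Data Catalog (the crawler can reach an on-premises database over a JDBC connection), apply a simple column-mapping/cleaning step, and land the result as Parquet in an S3 data lake. The database, table, column, and bucket names below are placeholders, not part of the role.

import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard Glue job bootstrap: resolve the job name and initialise the job.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract: read a table registered in the Glue Data Catalog by a crawler.
orders = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db",          # hypothetical catalog database
    table_name="sales_orders",  # hypothetical catalog table
)

# Transform: keep only the columns the warehouse needs and cast them explicitly.
mapped = ApplyMapping.apply(
    frame=orders,
    mappings=[
        ("order_id", "long", "order_id", "long"),
        ("order_ts", "string", "order_date", "timestamp"),
        ("amount", "double", "amount", "double"),
    ],
)

# Load: write the cleaned data as Parquet into the S3 data lake.
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-data-lake/curated/orders/"},
    format="parquet",
)

job.commit()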

Required Skills/Abilities:

Previous experience as a data engineer or in a similar role.

Technical expertise with data models, data scraping, data cleansing, and segmentation techniques.

Knowledge and understanding of AWS services such as AWS Glue (crawlers, jobs, databases, workflows), Amazon S3, Amazon AppFlow, Amazon Athena, AWS Lambda, etc.

Knowledge and experience in connecting to multiple data sources using AWS services, APIs, or connection protocols such as ODBC and JDBC.

Knowledge and experience with Python and PySpark.

Knowledge and experience in writing SQL and Spark SQL queries (see the example after this list).

Knowledge of MS Excel and the ability to build various views using pivot tables.

Strong numerical, statistical, and analytical skills.

A data engineering certification will be a plus.

Knowledge and experience with Beautiful Soup, Selenium, or Scrapy will be a plus.

Knowledge and experience with Terraform will be a plus.
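
To illustrate the SQL/Spark SQL requirement above, a small example sketch: register a DataFrame read from the lake as a temporary view and aggregate it with Spark SQL. The S3 path and column names are hypothetical and assume data shaped like the curated orders table in the earlier sketch.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sparksql-example").getOrCreate()

# Read curated Parquet data from the lake (path is a placeholder).
orders = spark.read.parquet("s3://example-data-lake/curated/orders/")
orders.createOrReplaceTempView("orders")

# Aggregate with Spark SQL: monthly order totals.
monthly_totals = spark.sql("""
    SELECT date_trunc('month', order_date) AS month,
           SUM(amount)                     AS total_amount
    FROM orders
    GROUP BY date_trunc('month', order_date)
    ORDER BY month
""")
monthly_totals.show()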

Education and Experience:

Bachelor’s degree in Computer Science or equivalent required.

2+ years of progressive experience working with AWS services.

For more details, contact us at fahim@hotizontech.biz.

Industry:
Functional Area:
Location:
Salary: Market Competitive
Gender: Male
Work Type: Full Time
Age: 20-30
Education: Graduate
Years of Experience: 2-3
Apply By: 31 Mar 2023
