IdealStaffs

Data Engineer (AWS cloud technologies)

Updated: Jan 7

Client: United Nations

Location: Valencia, Spain or Remote

Estimated Start Date: As soon as possible

Position: Consultant

To apply for this position, please send your resume to office@idealstaffs.com, referencing the job title.

IdealStaffs Consulting is looking for a passionate Data Engineer, well versed in AWS cloud technologies for ETL modeling, data warehouse and data lake design and building, and data movement, to join our expanding Data & Analytics team. The consultant will be responsible for supporting existing data warehouse customers, as well as performing a senior lead role on new projects to implement various data solutions in AWS. The position can be remote.

The scope of work includes:

• Participate in the project specification and software design phases

• Design and implement AWS architecture, services, and tools

• Design cloud-native application architectures or optimize existing applications for AWS

• Use S3, SQS, Lambda, Comprehend, Transcribe, Fargate, Aurora, API Gateway, CloudFormation, and SAM to build data flow pipelines

• Data visualization skills in Tableau or AWS QuickSight are a plus, but not required

• Collaborate within a project team to solve complex problems

• Ensure the delivered solution meets the technical specifications and design requirements

• Meet development deadlines

• Perform other duties as required
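The data flow pipeline work described above (S3, SQS, Lambda, and related services feeding a destination store) follows a standard extract-transform-load shape. The following is a minimal local sketch of that shape in Python; it makes no AWS calls, and the stage names and record fields are illustrative assumptions, not part of the role's actual codebase:

```python
import json

# Local ETL sketch mirroring an S3 -> SQS -> Lambda style pipeline.
# No AWS calls are made; field names and stages are hypothetical.

def extract(raw_lines):
    # Parse newline-delimited JSON, as might arrive from an S3 object.
    return [json.loads(line) for line in raw_lines]

def transform(records):
    # Enrich each record with a derived field (here, an integer cent amount).
    return [{**r, "amount_cents": round(r["amount"] * 100)} for r in records]

def load(records, sink):
    # Append to a destination list (a stand-in for Redshift, Aurora, or S3).
    sink.extend(records)
    return len(records)

raw = ['{"id": 1, "amount": 9.99}', '{"id": 2, "amount": 4.5}']
sink = []
loaded = load(transform(extract(raw)), sink)
```

In a real deployment each stage would be a separate component (an S3 event trigger, a Lambda handler, a Redshift COPY), but the extract/transform/load decomposition stays the same.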

The consultant shall possess the following certifications:

  • AWS Certified Developer – Associate, or

  • AWS Certified DevOps Engineer – Professional (nice to have), or

  • AWS Certified Big Data – Specialty (nice to have)

The resource should also have:

  • AWS Certified Cloud Practitioner (Preferred, not required)

  • Minimum of 5 years of experience with Data warehousing methodologies and modelling techniques

  • Minimum of 2 years of experience working with Massively Parallel Processing (MPP) analytical data stores such as Teradata

  • General understanding of the Snowflake architecture

  • Minimum of 1 year of experience in handling semi-structured data (JSON, XML) using the VARIANT attribute in Snowflake

  • Minimum of 3 years of experience in creating master data datasets; experience with an MDM tool is a plus

  • Minimum of 2 years of experience with migration methods to cloud data solutions

  • Minimum of 3 years of experience in working with Batch and Stream data

  • Minimum of 5 years of experience with SQL

  • Minimum of 2 years of hands-on experience with cloud technologies such as:

- AWS - S3, Glacier, EC2, Lambda, SQS, Redshift, Athena, EMR, AWS Glue, AWS Lake Formation, Kinesis, AWS Batch

- Azure - Blob Storage, Cool Blob Storage, Virtual Machines, Functions, SQL Data Warehouse, Data Factory, Databricks, Cosmos DB

  • Minimum of 3 years of experience with ELT concepts

  • Minimum of 3 years of experience with ETL tools such as Informatica

  • Experience in Hadoop, Hive, HBASE, Spark is a plus

  • Minimum of 5 years of RDBMS experience

  • Experience working with DevOps tools such as GitLab, Jenkins, CodeBuild, CodePipeline, CodeDeploy, etc.

  • Bachelor's or higher degree in Computer Science or a related discipline

  • Experience with containers (Docker)

  • Strong Python programming skills

  • Strong scripting skills: Bash Shell and PowerShell

  • General understanding and hands-on experience with Terraform
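One requirement above, handling semi-structured JSON and XML (the role Snowflake's VARIANT column plays), amounts to turning nested documents into queryable columns. The sketch below is a hypothetical Python illustration of that idea; the helper name and sample document are my own assumptions, not Snowflake's API:

```python
import json

def flatten(record, prefix=""):
    """Recursively flatten nested JSON into dotted-path keys,
    similar in spirit to querying VARIANT data with path expressions."""
    flat = {}
    for key, value in record.items():
        path = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            # Descend into nested objects, extending the dotted path.
            flat.update(flatten(value, path))
        else:
            flat[path] = value
    return flat

doc = json.loads('{"id": 1, "user": {"name": "Ana", "geo": {"city": "Valencia"}}}')
columns = flatten(doc)
```

In Snowflake itself the equivalent work is done in SQL with path notation (e.g. `col:user.geo.city`) and the FLATTEN table function rather than in application code.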


+34 643795676

©2019 by IdealStaffs Consulting