Data Engineer

Amazon Web Services, Inc.
Position Category 
Business Intelligence

Job Description

The AWS Global Business Operations (GBO) team is looking for a Data Engineer to play a key role in building its industry-leading Customer Information Analytics Platform. If you have experience building and maintaining very large data warehouses with high transaction volumes, we need you.

The Data Engineer will:

- Design, develop, implement, test, document, and operate large-scale, high-volume, high-performance data structures for business intelligence analytics and deep learning.
- Implement data structures using best practices in data modeling, ETL/ELT processes, and AWS Redshift, Oracle, and OLAP technologies.
- Provide online reporting and analysis using OBIEE business intelligence tools and a logical abstraction layer against large, multi-dimensional datasets and multiple sources.
- Gather business and functional requirements and translate them into robust, scalable, operable solutions that work well within the overall data architecture.
- Produce comprehensive, usable dataset documentation and metadata.
- Provide input and recommendations on technical issues to the project manager.

Basic Qualifications

- 2+ years of experience with, and detailed knowledge of, data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytics tools
- Demonstrated strength in architecting data warehouse solutions and integrating technical components
- 4+ years of experience with relational and star schema data modeling concepts
- Strong analytical skills and excellent knowledge of Oracle, SQL, and PL/SQL
- 4+ years of experience with very large data warehousing environments
- Excellent written and verbal communication skills
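The star schema data modeling called for above can be sketched with a minimal, self-contained example using Python's built-in sqlite3 module. All table and column names here are illustrative, not taken from the posting: a central fact table holds measures and foreign keys, and each dimension table holds descriptive attributes.

```python
import sqlite3

# Minimal star schema: one fact table referencing two dimension tables.
# Table and column names are hypothetical, for illustration only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_name TEXT
);
CREATE TABLE dim_date (
    date_key      INTEGER PRIMARY KEY,
    calendar_date TEXT
);
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    revenue      REAL
);
""")
cur.execute("INSERT INTO dim_customer VALUES (1, 'Acme')")
cur.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01')")
cur.execute("INSERT INTO fact_sales VALUES (1, 20240101, 99.5)")
conn.commit()

# A typical star-schema query: join the fact table to a dimension
# and aggregate a measure by a dimension attribute.
row = cur.execute("""
    SELECT c.customer_name, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_customer c ON f.customer_key = c.customer_key
    GROUP BY c.customer_name
""").fetchone()
print(row)  # ('Acme', 99.5)
```

The design choice that makes this a star (rather than snowflake) schema is that each dimension is a single denormalized table joined directly to the fact table, which keeps analytic queries to one join per dimension.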

Preferred Qualifications

- Experience gathering requirements and formulating business metrics for reporting
- Experience with Informatica, OBIEE, and AWS Redshift
- Expert understanding of ETL techniques and best practices for handling extremely large volumes of data
- Good understanding of the cloud industry
- Ability to leverage big data technologies (Hadoop, Hive, HBase, Pig, Spark, etc.)
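One ETL best practice for handling very large data volumes is to stream records through the pipeline and load them in fixed-size batches, so memory use stays bounded regardless of input size. The sketch below is a hypothetical, minimal illustration of that pattern (the source data, table name, and batch size are assumptions, not part of the role description); production ETL would typically run on tooling such as Informatica rather than hand-rolled code.

```python
import sqlite3
from typing import Iterable, Iterator

def extract(rows: Iterable[dict]) -> Iterator[dict]:
    # Extract: stream records from a source; here an in-memory list
    # stands in for a real source system.
    yield from rows

def transform(records: Iterator[dict]) -> Iterator[tuple]:
    # Transform: normalize values and drop records with no amount.
    for r in records:
        if r.get("amount") is not None:
            yield (r["id"], r["region"].upper(), float(r["amount"]))

def load(conn: sqlite3.Connection, records: Iterator[tuple],
         batch_size: int = 2) -> int:
    # Load: insert in fixed-size batches so only one batch is ever
    # held in memory, even for a very large input stream.
    cur = conn.cursor()
    cur.execute(
        "CREATE TABLE IF NOT EXISTS sales (id INTEGER, region TEXT, amount REAL)"
    )
    total, batch = 0, []
    for rec in records:
        batch.append(rec)
        if len(batch) >= batch_size:
            cur.executemany("INSERT INTO sales VALUES (?, ?, ?)", batch)
            total += len(batch)
            batch = []
    if batch:  # flush the final partial batch
        cur.executemany("INSERT INTO sales VALUES (?, ?, ?)", batch)
        total += len(batch)
    conn.commit()
    return total

source = [
    {"id": 1, "region": "us", "amount": "10.0"},
    {"id": 2, "region": "eu", "amount": None},   # dropped by transform
    {"id": 3, "region": "us", "amount": "5.5"},
]
conn = sqlite3.connect(":memory:")
loaded = load(conn, transform(extract(source)))
print(loaded)  # 2
```

Because each stage is a generator, no stage materializes the full dataset; swapping the in-memory list for a database cursor or file reader leaves the rest of the pipeline unchanged.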