AWS Data Engineer

Capgemini
Full-time
On-site
San Francisco, California, United States

Software Engineers perform requirements analysis. They then design, develop, or maintain the physical application (components) or the application environment, based on the Software Architecture (models and principles). Activities include coding; integrating, implementing, installing, or changing frameworks and standard components; and technical and functional application management. A Software Engineer also develops languages, methods, frameworks, and tools, and/or undertakes activities in support of server-based databases in development, test, and production environments.


Required Skills and Experience:


You are an experienced Software Engineer. You have received training and mastered at least one technology environment. You can elaborate on technology areas and position them within the scope of an overall project. You are a member of at least one community.


• Qualification: 3-7 years of experience (minimum 2 years of relevant experience in the role), Bachelor’s Degree.
• Certification: Should have, or be seeking, SE Level 1 certification.
• Should have progressing knowledge in Business Analysis, Business Knowledge, Software Engineering, Testing, Data Management, Architecture Knowledge, and Technical Solution Design.


Capgemini is an Equal Opportunity Employer encouraging diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, national origin, gender identity/expression, age, religion, disability, sexual orientation, genetics, veteran status, marital status or any other characteristic protected by law.


This is a general description of the Duties, Responsibilities and Qualifications required for this position. Physical, mental, sensory or environmental demands may be referenced in an attempt to communicate the manner in which this position traditionally is performed. Whenever necessary to provide individuals with disabilities an equal employment opportunity, Capgemini will consider reasonable accommodations that might involve varying job requirements and/or changing the way this job is performed, provided that such accommodations do not pose an undue hardship.


Click the following link for more information on your rights as an Applicant - http://www.capgemini.com/resources/equal-employment-opportunity-is-the-law

About Capgemini


Capgemini is a global leader in consulting, digital transformation, technology, and engineering services. The Group is at the forefront of innovation to address the entire breadth of clients’ opportunities in the evolving world of cloud, digital, and platforms. Building on its strong 50+ year heritage and deep industry-specific expertise, Capgemini enables organizations to realize their business ambitions through an array of services from strategy to operations. Capgemini is driven by the conviction that the business value of technology comes from and through people. Today, it is a multicultural company of 270,000 team members in almost 50 countries. With Altran, the Group reported 2019 combined revenues of €17 billion.

Visit us at www.capgemini.com. People matter, results count.

Role Responsibilities and Requirements:
•    6+ years of experience in data pipeline engineering for both batch and streaming applications
•    Must be hands-on coding capable in at least one core language (Python, Java, or Scala) with Spark
•    At least 2 years of expertise working with distributed data warehouses and cloud services (such as Snowflake, Redshift, and AWS) via scripted pipelines
•    This role intersects with the ‘big data’ stack to enable varied analytics, ML, and similar workloads, not just ‘DW’-type workloads
•    Experience in business domains such as Sales & Marketing, Direct to Consumer, and Ad Sales is of particular interest
•    Handling large and complex sets of XML, JSON, and CSV data from various sources and databases
•    Solid grasp of database engineering and design
•    Leverage frameworks and orchestration tools such as Airflow as required for ETL pipelines
•    Identify bottlenecks and bugs in the system and develop scalable solutions
•    Unit test and document deliverables
•    Capacity to successfully manage a pipeline of duties with minimal supervision
 
Required Technical Skills:
•    Very high test scores in Python, SQL, and DW concepts & logic
•    At least one core language (Python, Java, or Scala) with Spark
•    A variety of services: EC2, EMR, RDS, Redshift, Snowflake
•    Strong SQL
•    Other SQL-based databases, such as Oracle and SQL Server
•    AWS Cloud knowledge
•    Nice to have: Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores
 
Candidates should be flexible and willing to work across this delivery landscape, which includes, but is not limited to, Agile application development, support, and deployment.
 
Applicants for employment in the US must have valid work authorization that does not now, and will not in the future, require sponsorship of a visa for employment authorization in the US by Capgemini.