About Wells Fargo Wells Fargo & Company (NYSE: WFC) is a leading global financial services company headquartered in San Francisco (United States). Wells Fargo has offices in over 30 countries and territories. Our business outside of the U.S. mostly focuses on providing banking services for large corporate, government and financial institution clients. We have worldwide expertise and services to help our customers improve earnings, manage risk, and develop opportunities in the global marketplace. Our global reach offers many opportunities for you to develop a career with Wells Fargo. Join our diverse and inclusive team where you will feel valued and inspired to contribute your unique skills and experience. We are looking for talented people who will put our customers at the center of everything we do. Help us build a better Wells Fargo. It all begins with outstanding talent. It all begins with you.
Market Job Description
About Wells Fargo India and Philippines Wells Fargo India and Philippines enables global talent capabilities for Wells Fargo Bank, N.A., by supporting over half of Wells Fargo's business lines and staff functions across Technology, Business Services, Risk Services and Knowledge Services. Wells Fargo operates in Hyderabad, Bengaluru and Chennai in India, and in Manila, Philippines. Learn more about Wells Fargo India and Philippines on our International Careers website.
Departmental Overview The Enterprise Data Lake (EDL) group focuses on leveraging data as a strategic asset across Wells Fargo. Data is critical to Wells Fargo's continued success at cross-selling, deepening relationships with our customers, creating a seamless and convenient multi-channel experience, and meeting our commitment to satisfy our customers' financial needs. EDL defines how we use information across the company, aligns our business practices, and supports our continued focus on a great customer experience.
About the Role: The EDL Hadoop Big Data team is looking for an engineer to work in a fast-paced Agile development environment as a lead developer for Hadoop technologies. The team currently develops all of its big data applications on the Hortonworks Hadoop platform and serves as the customer point of contact for all Big Data client requests.
Key responsibilities: Responsibilities include, but are not limited to:
- Hands-on Hadoop development and implementation; provide senior SME knowledge and support to members of the team
- Engage in and lead solution architecture discussions to direct the future product roadmap
- Analyze highly complex business requirements and write technical specifications to design or redesign complex computer platforms and applications
- Act as an expert technical resource for modeling, simulation and analysis efforts
- Verify program logic by overseeing the preparation of test data and the testing and debugging of programs
- Support overall systems testing and the migration of platforms and applications to production
- Maintain and support existing Hortonworks components for data ingestion
- Develop new documentation, departmental technical procedures and user guides
- Assure that quality, security and compliance requirements are met for the supported area, and oversee the creation, update and testing of the business continuation plan
- Load data from disparate data sets
- Translate complex functional and technical requirements into detailed designs
- Test prototypes and oversee handover to operational teams
- Propose best practices and standards
Required Qualifications:
- 10+ years of application development and implementation experience
- 7+ years in Hadoop solution development
- 5+ years in Hive, HBase, Kafka, Streams, HDFS, Ranger and other Hadoop components
- 7+ years in Java development
- 3+ years in Spark
- 5+ years in development on Linux environment
- 3+ years of Teradata experience
- 5+ years of data warehouse experience
- 7+ years of SQL experience
- 3+ years of Hortonworks experience
- Experience with Python, shell scripting, Scala, Ansible or other scripting languages
- Experience with private and public cloud technologies (AWS, Google Cloud) is optional but great to have
Desired Qualifications:
- Good verbal, written, and interpersonal communication skills
- Experience with build and deployment tools such as GitHub, Jenkins and uDeploy (DevOps experience)
- Strong presentation and reporting skills to convey current state, challenges, needs and wins, and to garner support for ongoing ad hoc and scheduled initiatives
We Value Diversity At Wells Fargo, we believe in diversity and inclusion in the workplace; accordingly, we welcome applications for employment from all qualified candidates, regardless of race, color, gender, national or ethnic origin, age, disability, religion, sexual orientation, gender identity or any other status protected by applicable law. We comply with all applicable laws in every jurisdiction in which we operate.