The Senior Data Architect is a key member of the Investments Data Technology team, responsible for helping design and implement our enterprise-wide data ecosystem on the cloud. You will provide input and guidance across the entire lifecycle of cloud data solutions (including big data). This includes (but is not limited to) requirements analysis, modeling, recommending industry best practices, performing POCs, data security, integration, and recommending and developing appropriate toolsets for analytics.
Provide thought leadership and create solutions on a cloud-based data platform covering data security, data governance, data modeling, integration, scalability, etc., using native cloud technologies. Develop current-to-future-state architecture roadmaps. Align with and promote enterprise technology (data, infrastructure, applications, and security) principles and strategy.
Use appropriate design techniques and methodologies to translate project requirements into detailed designs, consistent with platform strategy and roadmap. Design external system interface architecture that includes appropriate application of techniques and standards. Determine integrated hardware and software architecture solutions that meet performance, usability, scalability, reliability, security, and business/functional requirements.
Create effective and efficient control patterns, conduct risk evaluations, and enforce policies and standards to enable the enterprise to conduct business at an appropriate level of risk. Research and recommend technology to improve current systems. Ensure capabilities are reviewed and compliant with appropriate levels of risk with respect to hardware/software currency, performance/availability, security, and information/transaction integrity, and drive awareness of required improvements.
Maintain deep expertise in architectural modeling, frameworks, emerging technologies, and best practices. Mentor and develop less experienced technical staff. Provide technical leadership in both business-as-usual and crisis conditions. Recommend new procedures and processes to drive desired results on diverse projects. Perform industry research with external parties and maintain networks among peers.
Develop and communicate governance parameters that take into account stakeholders' interests and processes with measurable results. Partner with other architects to ensure alignment and integration across portfolio boundaries and promote an enterprise focus on applications management. Develop and build strong relationships within and across the lines of business, and use effective communication skills to influence and accomplish strategic application architecture objectives.
Bachelor's degree or equivalent in Computer Science, Engineering, or a related technical field; or related work experience.
7-10 years of relevant experience required.
Broad experience across architecture practices (data, infrastructure, applications and security) with depth of experience in at least one area.
Knowledge of enterprise-wide architecture and the ability to "see the big picture" and how it affects current and future technologies.
Internal/External consulting experience that spans organizational boundaries and includes influencing technology and business leaders.
Depth of experience in the architecture, design, implementation and support of complex enterprise solutions.
Exposure to multiple, diverse technical configurations, technologies, and process improvements.
Familiarity with the various business facets of asset management, especially its datasets.
Prior experience working on an enterprise Data Lake/Data Pipeline as a Data Architect, Solutions Architect, or Senior Software Engineer.
Expertise in data modeling: designing and implementing data models for disparate datasets.
Expertise in ETL/ELT tools and concepts.
Expertise with cloud technologies such as AWS Glue, S3, IAM, Lambda, Redshift, DynamoDB, CloudWatch, Lake Formation, etc.
Programming experience with Python is a plus.
Working knowledge of DevOps processes and automation tools, and proficiency in Continuous Integration and Continuous Delivery pipeline models.
Must possess a strong desire to work hands-on developing Data Lake, Data Lakehouse, Data Warehousing, reporting, security, and performance management solutions for enterprise implementations.
Demonstrated problem-solving and quantitative skills, combined with excellent written and verbal communication skills, are required.
Demonstrated ability to explain highly technical solutions to both technical and non-technical audiences.