Do you want your voice heard and your actions to count?
Discover your opportunity with Mitsubishi UFJ Financial Group (MUFG), the 5th largest financial group in the world (as ranked by S&P Global, April 2018). In the Americas, we’re 14,000 colleagues, striving to make a difference for every client, organization, and community we serve. We stand for our values, developing positive relationships built on integrity and respect. It’s part of our culture to put people first, listen to new and diverse ideas and collaborate toward greater innovation, speed and agility. We’re a team that accepts responsibility for the future by asking the tough questions and owning the solutions. Join MUFG and be empowered to make your voice heard and your actions count.
Program Summary:
MUFG Americas is embarking on a business and technology transformation to effectively deliver five key business imperatives: Growth, Business Agility, Client Experience, Effective Controls, and Teamwork. To accomplish these imperatives, MUFG has launched a Transformation Program built upon the following foundation pillars:
- Core Banking Transformation Program
- Data Governance, Infrastructure & Reporting Program
- Technology Modernization Program
This position supports the Core Banking Transformation (CBT) Program. CBT is a multi-year effort to modernize our deposits platform with a premier, digitally led, and simplified ecosystem for consumer, small business, commercial, and transaction banking, delivering an exceptional customer experience and giving the bank a competitive advantage in the market. Our customers will benefit from streamlined and automated processes that will simultaneously provide the bank with business process efficiencies and operational cost savings.
The Core Banking Transformation technology team seeks a hardworking Data Engineer who is collaborative and passionate about solving complex data engineering problems. This role is responsible for the design, build, implementation, monitoring, and management of the MUFG Core Banking data services gateway, which provides the foundation for the bank's technology modernization and digital transformation.
As a data platform engineer, you will focus on building the firm's next-generation data environment. You will be a key player in creating a data services platform that drives real-time decision-making in service of our customers. You will develop, build, and operate the platform using DevSecOps and Site Reliability Engineering (SRE) methods. Key responsibilities include:
- Work closely with architecture teams to select, design, develop and implement optimized solutions and practices
- Create and maintain optimal data pipeline architecture, including the design, implementation, and continuous delivery of sophisticated data pipelines supporting development and operations
- Gather and process large, complex, raw data sets at scale (including writing data pipelines and scripts, calling APIs, writing SQL queries, etc.) that meet functional and non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Analyze complex data and data models, focusing on cross-functional requirements and source-to-target data model analysis, to develop and support the end-to-end data mapping effort
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using streaming, pipeline, and SQL/NoSQL technologies.
- Use distributed revision control systems (Git) with branching and tagging; create and maintain release and update processes using open-source build tools
- Develop and deliver ongoing releases using tiered data pipelines and continuous integration tools like Jenkins
- Automate environments and deployments with infrastructure-as-code, including deployment pipeline specification and development.
- Work with partners including the Business, Infrastructure and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Act as a data authority, driving greater functionality in our data systems.
- Responsible for production readiness and all operational aspects of the new data services that will support critical MUFG applications
- Partner with the Risk Management and Security teams to identify the standards and required controls, and lead the design, build, and rollout of secure, compliant data services to support MUFG's critical business applications and workloads
- Partner with application and DBA teams to experiment with, design, develop, and deliver on-premises as well as cloud-native solutions and services, powering digital transformation across business units
- Embrace infrastructure-as-code and leverage Continuous Integration / Continuous Delivery pipelines to run the full data service lifecycle, from the release of data service offerings into production through their retirement
- Participate in software and system performance analysis and tuning, service capacity planning and demand forecasting
- Write infrastructure, application, and data test cases, and participate in code review sessions.
- Analyze and tune the performance of infrastructure and data processing
- Provide Level 3 support for troubleshooting and service restoration in Production
Management or Supervision: Yes
Qualifications:
- Bachelor’s degree in Computer Science or a related field, or equivalent professional experience
- 7-10 years of meaningful technical experience, with at least 5 years of experience in the design, development, and delivery of critical data solutions in large, complex IT environments; possesses experienced-level skills in 3 or more of the following areas:
- Data Warehouse, Data Mart and Data Vaults
- Data Backup / Restore, Replication, Disaster Recovery
- Data field encryption and tokenization
- Application design / develop / test experience with RDBMS and/or NoSQL
- Database Administration experience with Relational and NoSQL databases
- Metadata management
- Data Services solution design and implementation experience in on-premises or cloud-native environments; possesses expert-level skills in 4 or more of the following areas:
- Experience with relational SQL and NoSQL databases, including Postgres, DynamoDB, etc.
- Experience with data pipeline and workflow tools: Wherescape Streaming, Wherescape RED, StreamSets Data Collector, etc.
- Experience with stream-processing systems: Kafka, AWS Kinesis, Apache Storm, Spark-Streaming, etc.
- A successful history of manipulating, processing, and extracting value from large, disconnected datasets, with ETL and data engineering know-how in SQL, Informatica PowerCenter, or similar tools
- Experience with secure cloud services platforms for data management and integration
- Experience with object-oriented and functional scripting languages: Python, Java, C#, etc.
- Awareness of data governance aspects such as metadata, business glossaries, data controls, data protection, canonical models, etc.
- Experience with container and orchestration technologies such as Docker, Kubernetes, and OpenShift
- Proven experience with Open Source software including OpenShift, Jenkins, PostgreSQL, etc.
- Strong experience automating processes and deployments using scripting languages (Bash, Python, Perl, etc.)
- Familiarity with the DevOps toolchain (e.g., Bitbucket, JIRA, Jenkins Pipeline, Artifactory or Nexus), and experience automating and deploying n-tier application stacks in cloud-native environments
- Excellent data & system analysis, data mapping, and data profiling skills
- Demonstrated understanding of modern, cloud-native application models and patterns
- Excellent collaboration skills and a passion for problem solving, with the ability to work alternative coverage schedules
- Strong verbal and written communication skills required due to the dynamic nature of collaboration with leadership, customers, and other engineering teams
- Experience within a high-integrity and/or regulated environment (government, healthcare, financial sectors, etc.)
- AWS professional-level certification is preferred but not required
The above statements are intended to describe the general nature and level of the work being performed. They are not intended to be construed as an exhaustive list of all responsibilities, duties, and skills required of personnel so classified.