Locations: Will also consider Jersey City or NYC
The Information Technology group delivers secure, reliable technology solutions that enable us to be the trusted infrastructure of the global capital markets. The team delivers high-quality information through activities that include developing essential capabilities, building infrastructure to meet client needs, and implementing data standards and governance.
The Data Engineering Platform and Delivery (DEPD) team designs and engineers secure data platforms, services and capabilities to meet application and business needs.
- Set direction and strategy for the Data Analytics, AI and Data Management platforms and tools team
- Determine requirements for and adopt new technologies and capabilities
- Analyze IT industry and market trends to determine impact to enterprise
- Establish team priorities and assign work to team members.
- Manage the team to meet balanced scorecard targets by monitoring progress toward metric targets and actively participating in Continuous Improvement initiatives to minimize unfavorable variances.
- Direct and ensure smooth workflow within the team by staying informed about key elements of the organization beyond the team and communicating key messages and updates in a timely manner.
- Maintain constructive relationships within and outside the team.
- Promote the professional development of the team.
- Manage the Demand Management process by assisting in developing and maintaining future demand projections (i.e., 6-12 months) to identify future staffing requirements and scheduling start and completion dates based on agreed scope of work.
- Support all efforts to effectively monitor, track and report progress of GOCS 3-Year Strategy accomplishments.
- Oversee the technical quality of the projects by ensuring that key technical procedures, standards, quality control mechanisms, and tools are properly utilized including performing root cause analyses for technical problems and engaging in work product quality review.
- Actively participate in the Change Control process by identifying potential scope variances by monitoring work requirements definition, issue resolution, development progress and user sign-off and reviewing change requests.
- Manage team work budgets.
- Mitigate risk by following established procedures, spotting key errors and demonstrating strong ethical behavior.
- Minimum of 10 years of related experience
- Bachelor's degree required. Master's degree preferred.
- Experience with database, analytics and data science technologies such as TensorFlow and AWS data science tools such as Amazon SageMaker
- Experience with business intelligence, analytics and reporting tools (Power BI, Tableau, Cognos, etc.)
- Experience engineering data infrastructure environments and data management tools
- Good knowledge of data management, data integration and database development techniques
- Proven data literacy, i.e., the ability to describe business use cases/outcomes, data sources, data management concepts, and analytical approaches/options
We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.