Working in the Central Security Delivery Group within Security Services, part of the first line of defence of the
CISO organization. Concurrently collaborating as a Machine Learning Consultant on innovation initiatives.
The role covers multiple activities that help the Information Security Services organization mitigate
information security risks and improve the Bank's security posture: analysing golden-source data and
developing ETL processes (Tableau, SAP BusinessObjects, SQL, Python and Excel VBA) to produce
reporting and intelligence on vulnerability remediation and compliance monitoring across different security
programs. 65 indicators/metrics across the following programs:
o Application Security Remediation Program (ASR Program)
o Security Patching Program
o Host Security Monitoring Program
o Network Security Infrastructure Monitoring Program
o Network Segregation Program
o Endpoint Security Program
o DLP Program
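As a sketch of the kind of per-program indicator the ETL pipelines described above could feed into reporting (the `findings` record schema and field names here are illustrative assumptions, not the Bank's actual data model):

```python
from collections import defaultdict

def remediation_rates(findings):
    """Aggregate vulnerability findings into per-program remediation rates.
    Each finding is a dict with (assumed) 'program' and 'status' fields."""
    counts = defaultdict(lambda: [0, 0])  # program -> [remediated, total]
    for f in findings:
        counts[f["program"]][1] += 1
        if f["status"] == "remediated":
            counts[f["program"]][0] += 1
    return {program: done / total for program, (done, total) in counts.items()}
```

In a real pipeline the same aggregation would run in SQL or pandas over the golden sources before landing in the reporting layer.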
Collaboration with the Global Markets team as a consultant on various projects developing classification,
regression and reinforcement learning models for stock price and currency rate prediction, market trend
analysis and portfolio optimization. Collaboration with the Research and Development team on security
logging analytics, introducing ML and NLP techniques to analyse logs and detect attack patterns and
anomalies.
Deutsche Bank AG, London and Frankfurt
Predictive modelling to solve univariate time-series problems: designing, coding and implementing models to
forecast market prices and currency rates. Approaches used include statistical modelling with techniques
such as ETS decomposition, the ADF test, ACF/PACF analysis, autoregression, moving averages and
ARIMA/SARIMA models; supervised classification and regression models to predict trends and prices using
the SciPy and scikit-learn libraries with boosting/voting and stacking/blending techniques; and deep learning
with recurrent neural networks and LSTMs using TensorFlow and Keras.
Feature selection over technical and fundamental signals, used with supervised learning to predict
underperformers and overperformers, combining these signals into trading strategies and optimised
portfolio allocations to maximize Sharpe ratio and compound annual growth rate. Use of technical and
fundamental signals as source features of machine learning models to generate buy and sell signals and
build trading strategies from them. Introduction of sentiment analysis and sentiment scores to
screen and select the right investments for each portfolio model, and use of the sentiment scores as an
additional decision feature for when to go short or long in different investment strategies (StockTwits).
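A minimal sketch of the autoregressive idea behind the forecasting work above, fitting an AR(1) model by ordinary least squares in plain NumPy (production work would use statsmodels' ARIMA/SARIMA; this only illustrates the recursion):

```python
import numpy as np

def ar1_forecast(series, steps=1):
    """Fit an AR(1) model y_t = c + phi * y_{t-1} by least squares
    and forecast `steps` values ahead by iterating the recursion."""
    y = np.asarray(series, dtype=float)
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])  # [1, y_{t-1}]
    c, phi = np.linalg.lstsq(X, y[1:], rcond=None)[0]
    forecasts, last = [], y[-1]
    for _ in range(steps):
        last = c + phi * last
        forecasts.append(last)
    return forecasts
```

On a perfectly linear series the fit is exact (c = 1, phi = 1), which makes the behaviour easy to verify.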
Recommendation engine for advisory clients: designed the IT architecture, algorithms and models of a tool that
recommends financial products to clients with advisory mandates. A hybrid of content-based and collaborative
filtering using the ALS algorithm with implicit preferences, creating implicit rankings based on the capital
gains and income obtained by clients with similar investment/risk profiles. Experimental use of anonymous
rankings based on sentiment analysis and short/medium-term predictions for the products in scope.
Managed-portfolio optimizer: designed a tool that optimizes portfolio allocation to maximize profits on managed
portfolios and on client pockets while reducing tax on income and capital gains.
Natural Language Processing: NLTK, spaCy and Gensim libraries to solve common classification problems
such as spam filtering and document classification, using BoW, TF, TF-IDF and n-gram representations.
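In practice these representations came from the NLTK/Gensim vectorizers; a bare-bones TF-IDF sketch in pure Python shows the weighting scheme those libraries implement:

```python
import math
from collections import Counter

def tfidf(docs):
    """Minimal TF-IDF: term frequency times inverse document frequency.
    Returns one {term: weight} dict per document. Terms appearing in
    every document get weight 0 (idf = log(N/N) = 0)."""
    N = len(docs)
    tokenized = [d.lower().split() for d in docs]
    df = Counter()
    for toks in tokenized:
        df.update(set(toks))                     # document frequency
    vectors = []
    for toks in tokenized:
        tf = Counter(toks)
        vectors.append({t: (n / len(toks)) * math.log(N / df[t])
                        for t, n in tf.items()})
    return vectors
```

A term like "offer" that occurs in every document scores zero, while a rarer discriminative term like "spam" keeps a positive weight, which is exactly what makes the representation useful for spam filtering.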
Big Data / Spark: SparkSession and SparkContext, supervised and unsupervised learning (ML and MLlib),
recommendation engines and the Spark Streaming libraries. Distributed deep learning models at scale with
Spark (Elephas and TensorFlowOnSpark libraries): data-parallel training of deep learning models, distributed
hyperparameter optimization and distributed training of ensemble models.
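Data-parallel training as done by Elephas/TensorFlowOnSpark broadly means training model replicas on data shards and combining their parameters; a toy single-machine sketch of parameter averaging with a linear model (NumPy stands in for the Spark executors here):

```python
import numpy as np

def parallel_fit(X, y, n_shards=4):
    """Data-parallel sketch: fit a least-squares model on each data shard
    independently, then average the per-shard weights (parameter
    averaging), as data-parallel frameworks do across workers."""
    Xs = np.array_split(X, n_shards)
    ys = np.array_split(y, n_shards)
    weights = [np.linalg.lstsq(Xi, yi, rcond=None)[0]
               for Xi, yi in zip(Xs, ys)]       # one "worker" per shard
    return np.mean(weights, axis=0)             # driver-side averaging
```

Real frameworks iterate this (average, rebroadcast, continue training) rather than averaging once, but the shard-then-combine structure is the same.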
Experience on 3 projects, including web content SEO, a recommendation system and search-experience improvement
E-commerce Search Engine business logics implementation - Java
E-commerce recommendation & Machine Learning (ML) models integration - Java
Web service performance optimization - Jmeter, Fiddler, PRTG, Webpagetest
Jenkins (Hudson) CI build process, SVN source control
Red Hat Linux administration, shell scripting - bash, awk, grep
(Soft skills) Agile/Scrum master, iteration planning and scheduling
Leading and participating hands-on in the design and development (Java, Scala, Python, SQL) of a Big Data platform for storage, processing and machine learning analytics of market data time series and structured data from upstream line-of-business systems – both batch and real-time analytics.
Responsible for spearheading the widespread use of machine learning analytics on Big Data in the organization.
Design and implementation of scalable machine learning and statistical analysis algorithms across a range of Spark-based analytics modules – Spark MLlib, TensorFlow, H2O, Apache Mahout, Python (scikit-learn), R
Design and development of Spark-based ETL data processing pipelines, data models, and data management and query software components for HBase, Hive and Impala
Development of data driven User Interfaces in D3.js
Lead role in interpreting and analysing business use-cases and feature requests into technical designs and development tasks.
Responsible for ensuring highly interactive response times by identifying performance bottlenecks and preventing them from creeping into the system.
Big Data / Hadoop Cluster Infrastructure Design, Sizing, Capacity Planning and hands on deployment and configuration.
Design and hands on implementation of Kafka Message Broker Cluster - nodes, storage, topic / partition / replication structure, message keys for partitioning, message schemas and compression, system monitoring and management etc
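One of the Kafka design points above – message keys for partitioning – relies on a stable key hash, so that all messages for a given key land on the same partition and preserve their ordering. Kafka's default producer uses murmur2; this sketch approximates the behaviour with a stdlib hash:

```python
import hashlib

def partition_for(key: str, num_partitions: int) -> int:
    """Deterministically map a message key to a partition, so all messages
    with the same key (e.g. one trade ID) keep their ordering on one
    partition. Kafka's default partitioner uses murmur2; md5 here is a
    stand-in to show the stable hash-then-modulo structure."""
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_partitions
```

The important property is determinism: repartitioning (changing `num_partitions`) breaks the key-to-partition mapping, which is why partition counts are sized up front.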
Key member of the programme board responsible for definition, planning and analysis of the new Treasury and Finance Target Operating Model.
Led the design of real-time, Hadoop based, Finance and Treasury Liquidity Risk Management platform featuring parallel processing of batch and streaming data feeds, financial modelling, stress testing and Business Intelligence functions
Led the design of Data Aggregation, Processing and Distribution Platform, Business Data Protocols and Interfaces between the Enterprise Data Hub and Treasury, FO Trading, Collateral Management, Market Data Management, Risk and Valuation Engines, Funding Services, Reference Data, Financial Reporting, Product Management and Transaction Management.
Design and implementation of data processing and statistical analysis algorithms within the Hadoop framework for handling both data in storage and streaming data. Design of real-time and batch processing, compute job automation and cluster resource management.
Design of federated data views across data in storage and streaming data.
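A federated view over data in storage and streaming data can be thought of as a batch snapshot overlaid with the most recent stream updates; a deliberately simplified, dictionary-based sketch (the key/value shapes are illustrative, not the platform's actual schema):

```python
def federated_view(batch_snapshot, stream_updates):
    """Merge a batch-layer snapshot with more recent streaming updates,
    letting the latest value per key win – a minimal stand-in for a
    federated query across data at rest and data in motion."""
    view = dict(batch_snapshot)   # start from the batch layer
    view.update(stream_updates)   # overlay fresher streaming values
    return view
```

In the real platform the "merge" is a query-time union across Hadoop storage and the streaming layer rather than an in-memory dict, but the last-write-wins-per-key semantics are the same.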
Design of message and data set based data integration and data distribution solutions linking the Enterprise Data Hub with line of business systems and consumers. Design of interfaces and data management solutions for SAP Finance / General Ledger and Hadoop.
Design of data visualization solutions operating on data directly in the Enterprise Data Hub
Design, Implementation and Organization of ongoing Hadoop Operations - Hadoop Cluster Planning and Installation, Configurations, Policies and Procedures for Identity and Access Management to Hadoop data, Resource Management, Cluster Maintenance, Monitoring, Performance Tuning, Troubleshooting and Backup and Recovery
Design of the storage, networking and virtualization aspects of the Hadoop cluster.
Design and organization of Data Governance, Data Lineage, Data Audit, Data Workflows, Data Discovery, Metadata Management
Hands on code development in Java and Python as well as code reviews and mentoring of the development team
Led the Functional Modelling and participated in the Solution Design and Delivery of a Strategic Enterprise-Wide Data Aggregation and Re-Distribution Platform for transmission of OTC Derivatives Business Data, loosely based on the industry-wide XML format FpML, between the following business functional areas – OTC Derivatives Front Office, Risk, Finance, Clearing and Settlements.
Led the Functional Modelling of Centralized Stock and Cash Record by consolidating all relevant Sub-Ledgers, including Stock Records, across all asset classes globally onto a single infrastructure. Delivery of significantly improved Trade Capture and Clearing & Settlement event data feeds.
Supported the design of Global Cash and Stock Record Coherence based Applications – Design of Oracle Coherence Distributed Data GRID, Design and Implementation of Indexing Data Structures and Algorithms for fast, in-memory data access, Design and Implementation of highly scalable, low latency, message-based data feeds, event-driven, parallel data processing application modules, distributed access control
Spearheaded the Conceptualization and then Prototyped the use of Social Media Data (especially Google Search Queries time series) based Predictive Analytics for a number of key Economic Indicators. The work was done as part of Innovation Programme within BarCap sourcing proposals for new ideas and solutions.
Leading the documentation of the Solution and Data Architectures and syndicating it with key stakeholders within FO, MO, Settlements, Risk and IT.
Responsible for Project and Work-Stream Management on the programme.
F&C Asset Management Plc
Designed SOA Data Access and Data Management Solution based on SOA Data Services Platform and Master Data Management Infrastructure – the objective was to provide run-time Federation and Virtualisation of the overall Data Model.
Selected, Designed and Extended key SOA and Object Oriented application frameworks providing the overall application structure and supporting streamlined development of custom business logic. Defined the key principles and design patterns for the application frameworks to be used by the development team.
Defined the overall SOA Security Strategy, Security Requirements, Security Capabilities Specification and Conceptual and Technical Security Architecture.
Designed complex Service Compositions involving distributed transactions, business state machines, exception handling, process integrity, security and propagation of credentials and aggregation of business logic
Organised a series of workshops with business and technical leaders to identify and scope the business opportunities, outline potential value and return on investment (ROI) and identify risks and constraints. Proceeded with systematic mapping of business opportunities to business and information architecture, business and technology capabilities and constituent technical solutions.
Specified and facilitated the adoption of Service-Oriented Analysis and Modelling Methodology guiding the decomposition of the Business Domain Models into a set of SOA Services providing the best alignment between IT Assets and the Business. Created the SOA Reference Architecture. Responsible for the alignment between SOA Process, Service and Data Models.
Responsible for coordinating solution work across multiple technical disciplines, business units and national and regional boundaries. Coordinating and overseeing the linkages between the business strategies and application portfolio to ensure the programme remains strategically aligned.
Performed cost analyses and vendor comparisons to ensure cost-effective and efficient operations and measured the feasibility of various approaches. Ensured adherence to the overall business and financial model. Made complex investment recommendations to management based on results of independent assessment of current and future opportunities and risks.
Responsible for analysis of situations and data requiring an in-depth evaluation of multiple factors.