
Digitizing GDPR In Financial Services – Part 5 – The Transfer of Data


Welcome to Part 5 of this journey through some of the technical and engineering challenges that companies must address in order to satisfy GDPR requirements.

In the previous parts of this series, the key notions of comprehensive and coherent identity + role definition, and how they must relate to business processes, were explored primarily within the comfort zone of a single homogeneous enterprise.

As soon as external supply and distribution chains are added to this mix, all of the issues I have explored so far are magnified, because every entity in the chain must be consistent in its consumption, processing and evidential reporting.

The transfer of data also adds two new elements to the mix: rationalising the protocols + routes used to transfer data, and watermarking the content so that it can be traced along supply chains and any leakage or corruption identified.

The introduction of “barcoding” along the global food supply chain after the BSE crisis, and the many forms of tamper-evident packaging that have been introduced into consumer products, are probably the most tangible examples of the necessity and practicality of a similar approach to data.
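To make the watermarking notion a little more concrete, here is a minimal Python sketch, assuming a simple JSON-style record and a watermark key held only by the data owner; the key handling, field names and recipient identifiers are all illustrative assumptions rather than a prescribed scheme. Each record handed to a downstream partner is stamped with a keyed hash tied to that recipient, so a leaked copy can be traced back and any tampering detected.

```python
import hashlib
import hmac
import json

# Key held only by the data owner - purely illustrative, not a real key-management scheme.
WATERMARK_KEY = b"replace-with-a-key-held-only-by-the-data-owner"

def watermark_record(record: dict, recipient_id: str) -> dict:
    """Stamp a record with a keyed watermark tied to the recipient it is being sent to."""
    payload = json.dumps(record, sort_keys=True).encode()
    tag = hmac.new(WATERMARK_KEY, payload + recipient_id.encode(), hashlib.sha256).hexdigest()
    return {**record, "_watermark": tag, "_recipient": recipient_id}

def verify_record(stamped: dict) -> bool:
    """Check that a record found in the wild is intact and matches its claimed recipient."""
    original = {k: v for k, v in stamped.items() if k not in ("_watermark", "_recipient")}
    payload = json.dumps(original, sort_keys=True).encode()
    expected = hmac.new(WATERMARK_KEY, payload + stamped["_recipient"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, stamped["_watermark"])

# Example: stamp a record before handing it to a hypothetical supply-chain partner.
outbound = watermark_record({"client_id": "C123", "balance": 1000}, recipient_id="vendor-A")
assert verify_record(outbound)
```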

Financial Institutions have had to deal with several precursors to GDPR that attempted to establish protections around the handling of “Client Identifying Data”, but the sad truth is that most of what they achieved was a patchwork of point solutions for particular key applications.

These implementations were chased by a flotilla of vendors offering various incompatible pseudonymising technologies, which are only really used for moving production data into closely coupled dev/test environments staffed by offshore or outsourced resources.
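For readers unfamiliar with the technique, the sketch below shows deterministic pseudonymisation in its simplest form, assuming a keyed hash held by the production data controllers; the key, field names and token format are illustrative assumptions. The same client identifier always maps to the same token, so referential integrity across copied tables survives, but the underlying value never leaves production.

```python
import hashlib
import hmac

# Key held by the production data controllers - illustrative only.
PSEUDO_KEY = b"key-held-by-production-data-controllers-only"

def pseudonymise(client_identifier: str) -> str:
    """Replace a client-identifying value with a stable, non-reversible token."""
    digest = hmac.new(PSEUDO_KEY, client_identifier.encode(), hashlib.sha256).hexdigest()
    return "CID-" + digest[:16]

# The same input always yields the same token, so joins between copied tables still work.
row = {"client_name": "Jane Doe", "account": "GB29NWBK60161331926819", "balance": 2500}
masked = {**row,
          "client_name": pseudonymise(row["client_name"]),
          "account": pseudonymise(row["account"])}
```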

If the IT industry is going to get serious about addressing the data movement and management challenges that GDPR poses, it is going to have to embed the necessary monitoring and controls deep into future generations of compute, storage and networking platforms.

During the low-latency trading arms race of the late 1990s and the early years of the current millennium, lots of new instrumentation techniques were devised to understand the performance of the protocols used for trading data interchanges.

In several past lives I have looked at the feasibility of harnessing network measurement technologies such as SPAN/mirror ports and protocol analysis features such as NetFlow to build accurate maps of inter-application communication, in a similar fashion to the conversation metadata analysis techniques used by the intelligence services.
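As a rough illustration of the idea, the sketch below assumes flow records have already been exported from a NetFlow collector as a CSV with src_ip, dst_ip, dst_port and bytes columns (the file name and column layout are assumptions, not any particular vendor's format) and aggregates them into a weighted “who talks to whom” edge list.

```python
import csv
from collections import Counter

# Sketch: aggregate exported flow records into a weighted inter-application edge list.
# Assumed CSV columns: src_ip, dst_ip, dst_port, bytes (layout is illustrative only).
def build_communication_map(flow_csv_path: str) -> Counter:
    edges = Counter()
    with open(flow_csv_path, newline="") as f:
        for flow in csv.DictReader(f):
            edges[(flow["src_ip"], flow["dst_ip"], flow["dst_port"])] += int(flow["bytes"])
    return edges

# Heaviest conversations first - the raw material for a map of who talks to whom.
# for (src, dst, port), total_bytes in build_communication_map("flows.csv").most_common(20):
#     print(f"{src} -> {dst}:{port}  {total_bytes} bytes")
```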

This systematic approach will be fundamental to understanding the true workings of complex regulated enterprises and verifying their behaviour, rather than relying on the “oral and wiki tribal folklore” of current IT support teams, many of whom are themselves transient and contract/offshore-based, with little or no “esprit de corps”.

A former colleague of mine once described the business-aligned IT teams within a major Investment Bank as a “network of terrorist cells” – it is ironic that the same anti-terrorist techniques should now be considered to address the communication behaviours that these development teams produce.

The rise of Software Defined Networking technologies, with the resultant fusion of databases, messaging switches and caches, also holds out hope that we may see a future generation of “data-aware” infrastructure, both on premise and in cloud platforms, rather than the merely “protocol-capable” world we have today.

Now it’s time for the regular “exercise for the reader”: this time, go and find out whether your corporate network has NetFlow monitoring enabled, and whether anyone has ever used the data it produces in conjunction with your IT asset management platform to map out your enterprise’s data interactions.
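As a starting point for that exercise, the hypothetical sketch below joins the flow edge list from the earlier example with an asset-management export, assumed to be a CSV mapping each IP address to an application and an owning team, so that conversations are labelled at application level rather than as raw IP addresses.

```python
import csv
from collections import Counter

# Hypothetical asset-management export: one row per IP with columns ip, application, owner.
def load_asset_inventory(path: str) -> dict:
    with open(path, newline="") as f:
        return {row["ip"]: (row["application"], row["owner"]) for row in csv.DictReader(f)}

# 'edges' is the Counter of (src_ip, dst_ip, dst_port) -> bytes built in the earlier sketch.
def label_edges(edges: Counter, assets: dict) -> Counter:
    """Re-key the flow edge list by application name instead of IP address."""
    labelled = Counter()
    for (src, dst, port), total_bytes in edges.items():
        src_app, _src_owner = assets.get(src, ("UNKNOWN", "unowned"))
        dst_app, _dst_owner = assets.get(dst, ("UNKNOWN", "unowned"))
        labelled[(src_app, dst_app, port)] += total_bytes
    return labelled

# Example usage (file names are illustrative):
# edges = build_communication_map("flows.csv")
# assets = load_asset_inventory("asset_inventory.csv")
# for (src_app, dst_app, port), total in label_edges(edges, assets).most_common(20):
#     print(f"{src_app} -> {dst_app}:{port}  {total} bytes")
```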

Rupert Brown is CTO of The Cyber Consultants. He has an unrivalled track record over 30 years in Banking IT, comprising senior Strategic and Operational roles in Frontline Application Architecture, Development and Delivery, as well as ground-breaking Enterprise Technology Infrastructures. This has also been complemented by similar client-facing leadership roles for Information Vendors and Silicon Valley “Unicorns”. He was formerly a Chief Architect at UBS and before that served in senior roles at Bank of America Merrill Lynch, Reuters, Paribas and Morgan Stanley.

This article was first published on LinkedIn Pulse on 21st February 2017 and can be found here.

The opinions expressed by guest bloggers are their views and do not necessarily reflect the opinions of Corix Partners.