Bates Research | 09-21-17
A Conversation on Big Data in Enforcement: How Firms and Compliance Officers Should Prepare
This week, Bates Research sits down with two of its experts to talk about data analytics and regulatory enforcement, the increasingly complex responsibilities of compliance officers, and what firms should be doing to prepare in this changing technological landscape. Alex Russell is the Director of Institutional and Complex Litigation and the co-leader of Bates Group’s Big Data Analytics segment. He has provided consulting services to both regulators and financial institutions on numerous matters involving allegations of market manipulation in a wide variety of forms (such as spoofing/layering and insider trading), as well as assistance on matters in which the stated performance of a manager or strategy is being questioned. Dr. Shane Shook is an independent expert providing critical analysis and reporting in cases involving information management and governance. Dr. Shook has provided consulting and testimony services to exchanges, government agencies, and private clients on issues related to data integrity and trading functions.
Bates Research: Thank you, gentlemen, for joining us today. Lately, the news is filled with stories of regulators developing and using new forms of big data analysis in enforcement. What are some of the steps financial firms should take, given these new investigative efforts?
Alex Russell: Firms first need to understand the risks. They need to be aware of what their data would say about them if a regulator comes looking. When a firm provides data to a regulator, the government will use it, and not necessarily for the purposes originally intended. For example, the Market Abuse Unit at the SEC has used blue sheet data provided in unrelated contexts to bring insider trading charges. I read that they are sifting trading data going back 15 years to identify individuals who have made repeated, well-timed trades ahead of corporate news. So firms should consider the possible risks inherent in the data they are providing to regulators.
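The detection pattern Russell describes can be sketched in a few lines. This is a minimal, hypothetical illustration, not the SEC's actual methodology: the function, field layout, window, and thresholds are all invented for the example. The idea is simply to count, per account, buys placed shortly before a corporate announcement in the same symbol, and flag accounts whose count crosses a threshold.

```python
from datetime import date, timedelta
from collections import defaultdict

def flag_well_timed_traders(trades, news_events, window_days=3, min_hits=2):
    """trades: list of (account, symbol, trade_date) buy records.
    news_events: list of (symbol, announcement_date).
    Returns accounts with at least `min_hits` buys placed within
    `window_days` before a news event in the same symbol."""
    hits = defaultdict(int)
    for account, symbol, t_date in trades:
        for n_symbol, n_date in news_events:
            if symbol == n_symbol and timedelta(0) < (n_date - t_date) <= timedelta(days=window_days):
                hits[account] += 1
                break  # count each trade at most once
    return sorted(a for a, n in hits.items() if n >= min_hits)

# Invented sample data for the sketch
trades = [
    ("ACC1", "XYZ", date(2017, 1, 9)),  # 2 days before XYZ news
    ("ACC1", "ABC", date(2017, 3, 1)),  # 1 day before ABC news
    ("ACC2", "XYZ", date(2017, 1, 2)),  # 9 days out: beyond the window
]
news = [("XYZ", date(2017, 1, 11)), ("ABC", date(2017, 3, 2))]

print(flag_well_timed_traders(trades, news))  # → ['ACC1']
```

A real surveillance system would of course control for position history, news materiality, and base rates of trading, but the core "repeated well-timed trades" screen reduces to a join-and-count like this.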
It’s paramount that firms make sure their existing systems are actually providing the right output, and part of that involves understanding the data on a fundamental level. Sticking with the blue sheet example, in the past couple of years we’ve seen some pretty big penalties against two major financial institutions for providing incomplete blue sheet data. The misreporting was driven by technology errors, sure, but also by human error in failing to catch the issue.
Shane Shook: Firms also need a better understanding of what regulators understand “big data” to mean. For regulators, it is the product of an amalgamation of different (and disparate) data sources that are homogenized through data “smoothing,” which can introduce inaccuracies in the details. In many cases, we have seen, or helped clients identify for themselves, the illogical associations that software sometimes makes based upon “garbage in/out” processing of big data. So the quality of the software is a big issue. The quality of the data is also an issue. In any big data assessment there must be a data quality assessment that covers the raw data as well as the communication and processing of the data to satisfy the related question of accuracy. Firms must examine the data quality issue in their SRO and other compliance reporting, and critically examine evidence produced by litigants or regulators in related actions. Big data does not mean reliable data, and big data systems are only robotic methods of processing what they are fed.
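The raw-data quality assessment Shook calls for starts with mechanical checks for completeness and plausibility. The sketch below assumes a simplified, invented record layout; real blue sheet and SRO reporting formats have many more fields and validation rules, and this is only an illustration of the kind of check involved.

```python
# Invented field names for illustration only
REQUIRED_FIELDS = ("account_id", "symbol", "quantity", "price", "trade_date")

def assess_quality(records):
    """Return (record_index, issue) pairs for missing fields
    and implausible values in a batch of trade records."""
    issues = []
    for i, rec in enumerate(records):
        for field in REQUIRED_FIELDS:
            if rec.get(field) in (None, ""):
                issues.append((i, f"missing {field}"))
        qty = rec.get("quantity")
        if isinstance(qty, (int, float)) and qty <= 0:
            issues.append((i, "non-positive quantity"))
        price = rec.get("price")
        if isinstance(price, (int, float)) and price <= 0:
            issues.append((i, "non-positive price"))
    return issues

records = [
    {"account_id": "A1", "symbol": "XYZ", "quantity": 100,
     "price": 10.5, "trade_date": "2017-01-09"},
    {"account_id": "", "symbol": "XYZ", "quantity": -5,
     "price": 10.5, "trade_date": "2017-01-09"},
]
print(assess_quality(records))
# → [(1, 'missing account_id'), (1, 'non-positive quantity')]
```

Checks like these address only the "raw data" half of Shook's point; assessing the communication and processing of the data requires reconciling outputs against the source systems as well.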
Bates: It seems that firms might be behind the curve on effective responses to government data analysis. Are the regulators actually ahead of financial institutions in terms of data analytics right now?
Russell: It’s an interesting question. It’s really a difference in focus: financial institutions are using sophisticated data analytics in market-facing ways, but may not be putting the same level of effort into monitoring and compliance areas within the firm. So for internal examination of the institution itself, regulators probably are ahead. The government is certainly putting in the effort and devoting resources to it, like the five-year, $90 million contract the SEC has with Palantir to identify associated entities and anomalous patterns of interactions, and visualize them for examination focus.
Shook: Palantir is a good example of regulators pushing boundaries. Palantir uses artificial intelligence to help identify possible insider trading. The company has used it as a “coding challenge” interview exercise, so the code has since been publicly released and discussed on many sites on the Internet. In short, the model is simply a two-dimensional assessment of trading series information using an AI method to identify problem behaviors. I should point out that there is no “machine learning” involved in this simplistic model, but AI does not always need to be enhanced with ML.
As for the current sophistication of financial firms, every trading strategy and risk management formula utilizes advanced financial engineering (FE) tools, yet compliance functions do not. It could be argued that every financial engineer should rotate through compliance in order to trade skills for experience, helping the industry catch up to where regulators are actually outpacing it today.
Bates: How immersed in the firm’s technology and data does the firm’s Chief Compliance Officer have to be in today’s regulatory environment?
Russell: It is clear that financial institutions are placing greater reliance on the judgment of the Chief Compliance Officer and compliance professionals. In a recent Bloomberg article examining that question, the reporter asserted that such judgment is increasingly dependent on the ability to handle technological advances like big data, data lakes, machine learning and artificial intelligence. I agree with the conclusion that these advances are combining to revolutionize how companies collect, manage and use data to inform strategy and make business decisions. The point is that the Chief Compliance Officer and compliance teams need the ability to work with data: interpreting it, taking a holistic view of activity across the firm, and not getting overwhelmed by data pushed to them from different systems within the organization.
Shook: Today’s data is not linear (or “sequential” as it used to be termed) as a simple time-series. It is “deep” with consideration of arbitrage on the buy/sell side. Accordingly, it should be perceived as such by compliance officers. The ability to assess the opportunity against the execution of a trade to meet compliance obligations depends upon an understanding of 3-dimensional data analytics – but of course executives have the added necessity to present the results in a 2-dimensional statement to the board and regulators. Without banging the drum too much, compliance officers must take on the responsibility for data quality assessments as well. Data quality is referenced in several standards, but has not yet been specifically detailed. It is, however, appearing as a defense argument to deny charges; therefore it will necessarily become more specific, and compliance officers will find it a part of their mandate.
Bates: Given recent enforcement activity, what are some recommendations for how firms should prepare for the new data-driven enforcement model? Is there anything they can do?
Russell: The right place to start is probably by working on systems integration, so that you have the vantage point you need to make smart decisions. Regulators are going to be looking at your data, and they are not going to be looking at it in a vacuum. They’re going to be combining it with other data sets from within your firm or the markets in general. You’ve got to be able to look at activity within your firm with the same degree of visibility.
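The cross-referencing Russell describes is, at bottom, a join between data sets a firm already holds. As a hedged sketch under invented assumptions (here, trade records joined against a hypothetical internal restricted list), it might look like this; a regulator would construct similar combined views from the firm's own productions.

```python
# Invented illustrative data: symbol -> date it was added to the restricted list
restricted_list = {"XYZ": "2017-01-05"}

trades = [
    {"account": "A1", "symbol": "XYZ", "date": "2017-01-09"},
    {"account": "A2", "symbol": "ABC", "date": "2017-01-09"},
]

def trades_in_restricted_symbols(trades, restricted):
    """Flag trades in symbols that were on the restricted list
    on or after the date the symbol was added."""
    # ISO-format date strings compare correctly as strings
    return [t for t in trades
            if t["symbol"] in restricted and t["date"] >= restricted[t["symbol"]]]

print(trades_in_restricted_symbols(trades, restricted_list))
# → [{'account': 'A1', 'symbol': 'XYZ', 'date': '2017-01-09'}]
```

The substance of systems integration is making joins like this possible at all, so that compliance sees the same combined picture a regulator would assemble.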
Shook: You also must be aware of and be prepared to fiercely defend the quality of your data, especially when it contradicts examination questions or litigation evidence. 30(b)(6) electronic evidence discovery is not sufficiently answered in today’s big data environment by the location of data stores or general business process descriptions – a firm should be prepared to defend the collection, aggregation, processing, modeling, use, and dissemination of their data. There is science required in data analytics, not merely engineering.
Bates: How have you been assisting Bates’ clients with these issues?
Russell: We’ve been very active in cases of market manipulation and abnormal performance detection. We’ve also provided guidance on system integrity to clients, making sure that what they want their system to be doing and capturing is what is actually happening, or that what they think the data is saying is actually what it is saying, especially in the areas of data quality examination and evidence review (against systems models, software, and data representations of quotes/orders/trades). In instances where internal compliance and monitoring resources are stretched thin, we’ve also provided comprehensive reviews that included practical templates for the organization to use going forward. For more about our work in this area, you can visit the Big Data Services section of the Bates website.
Bates: Thank you for your insight, gentlemen.