Your Data Is No Longer a Record. It Is Evidence.

THE STRUCTURAL BREAK

Most Firms Are Solving the Wrong Problem

For most registered investment advisers, data management feels like a maintenance task. Keep the records clean. Run the reconciliations. Pass the exam. That framing made sense in a world where regulators asked whether your controls existed. That world is ending.

The question regulators are asking today is not 'do you have a process?' It is 'can you prove the process worked, produce the evidence on demand, and demonstrate that a human with appropriate authority reviewed it?' Those are not refinements of the old question. They are a different question entirely.

Data has transitioned from a storage function to an accountability infrastructure. Firms that have not rebuilt around that reality are running a 2020 operating model inside a 2026 regulatory environment.

The shift has a name: the Evidence Economy. Trust is now priced through artifacts. Examiners want lineage, audit trails, exception logs, and governance documentation they can pull and interrogate. Performance narratives, policy documents, and verbal explanations of control intent are no longer sufficient. If you cannot produce clean, consistent evidence across systems on demand, you do not have control. You have the idea of one.

THE AI INFLECTION

Every AI Deployment Is a New Data Governance Obligation

Eighty-seven percent of wealth management firms are deploying AI in some operational capacity (Datos Insights, 2026 Wealth Management Trends). Seventy percent have AI in front-office production, up from ten percent a single year ago (SimCorp, 2026 InvestOps Report). That statistic should concentrate the mind of every COO and CCO in the industry because it means the majority of firms have acquired a new category of accountability exposure without yet restructuring the data architecture to match.

When an AI agent participates in a suitability decision, generates client communication, surfaces a surveillance alert, or executes any operational workflow adjacent to a decision, the regulatory question is not whether the system performed. It is whether the firm can prove it performed, explain why it produced the output it did, and demonstrate that a qualified human exercised meaningful oversight before any consequential action was taken.

That is a data architecture problem before it is anything else. It requires logging that captures inputs, models, and outputs. It requires audit trails that survive model updates. It requires replay capability so that a regulator examining a decision eighteen months after the fact can reconstruct exactly what the system saw, what it concluded, and what the human reviewer did with that conclusion. Most firms do not have this.
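What that logging and replay requirement implies in practice can be sketched as an append-only decision record. This is a minimal illustration, not a regulatory template; every field and function name here is hypothetical:

```python
import hashlib
import json
import datetime

def log_ai_decision(log_path, inputs, model_version, output,
                    reviewer=None, determination=None):
    """Append one decision record to a JSONL audit log.

    Hashing the inputs lets a later reviewer confirm that a replayed
    record reflects exactly what the model saw; pinning the model
    version keeps the record meaningful after model updates; the
    reviewer and determination fields capture human oversight.
    """
    record = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "input_hash": hashlib.sha256(
            json.dumps(inputs, sort_keys=True).encode()).hexdigest(),
        "inputs": inputs,                # retained verbatim for replay
        "model_version": model_version,  # survives model upgrades
        "output": output,
        "reviewer": reviewer,            # who exercised oversight
        "determination": determination,  # what they decided
    }
    with open(log_path, "a") as f:
        f.write(json.dumps(record) + "\n")
    return record
```

Because the log is append-only JSON lines, reconstructing a decision eighteen months later is a matter of reading the record back, not reassembling state from multiple systems.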

EU AI Act, Art. 26: Deployers of high-risk AI systems bear full accountability for compliance, operational use, and human oversight — not the AI provider. Enforceable August 2, 2026.

The EU AI Act makes this concrete. Article 12 requires logging of all high-risk AI system operations. Article 14 mandates human oversight with the functional authority to override, not merely observe. Article 26 places deployer obligations directly on the firm using the system. Vendor terms of service do not transfer this liability. If your firm is deploying AI in a workflow that touches biometrics, credit decisions, employment assessment, or client communications with risk implications, accountability sits with you. Your data infrastructure is the compliance layer.

Fines for violations in high-risk categories reach three percent of global annual revenue or 15 million euros. For prohibited uses, seven percent or 35 million euros. These are not aspirational enforcement numbers. They are enforceable figures attached to a deadline that is now fewer than five months away.

THE OPERATING MODEL FAILURE

The Headcount Response Is a Structural Trap

The investment operations industry has a preferred response to complexity: hire. When T+1 compressed the settlement cycle, sixty-five percent of firms expanded their operational teams rather than investing in technology to absorb the change (SimCorp, 2026 InvestOps Report). When regulatory requirements intensified, most firms added compliance staff. When data volumes grew, most firms added data operations personnel.

This is the headcount trap. It feels like problem-solving, but it is actually negative operating leverage. Every hire creates a dependency on individual attention and manual judgment. Every manual judgment creates a gap in the audit trail. Every gap in the audit trail is a surface for exam exposure. And the underlying structural problem, a fragmented data infrastructure that cannot produce evidence on demand, grows more expensive to manage as the team expands around it.

65% of firms added operational headcount to manage T+1 rather than investing in technology infrastructure.

63% of investment operations firms lack a unified front-to-back data layer. (SimCorp, 2026 InvestOps Report)

4.9% compensation cost growth versus 4.1% revenue per FTE growth — the math of the headcount trap. (Datos Insights, 2026 Wealth Management Trends)

The firms solving this problem are not the ones with the most staff. They are the ones that identified data infrastructure as a strategic function and built operating models that produce evidence automatically, not reactively. Clean data. Reconciled systems. Structured audit artifacts that exist before the examiner asks for them, not because someone assembled them under deadline pressure after the request arrived.

WHERE FIRMS ACTUALLY BREAK DOWN

The Gaps That Show Up Under Examination Pressure

Across the investment adviser landscape, the data management failures that create exam exposure are consistent regardless of firm size or strategy. They are worth naming precisely because they rarely announce themselves until the cost of fixing them is highest.

Vendor Accountability Without Vendor Evidence

Amended Regulation S-P makes clear that investment advisers are responsible for the data security practices of their service providers. Most firms manage this with attestations and questionnaires. The SEC examination standard has moved past that. Examiners want evidence of active oversight: documented security reviews, contractual notification obligations with specific timelines, and confirmation that vendors can operationally meet the firm's regulatory commitments in a breach scenario.

The 72-hour notification window under amended Regulation S-P, which requires service providers to notify the firm within 72 hours of becoming aware of a breach, is the critical test. Acting on that notice requires knowing what data you hold, where it lives, who has access, and whether any of it was compromised. Firms that lack a current data inventory cannot answer those questions in 72 hours. That is not a vendor problem. It is a data architecture problem.

AI Governance Principles Without AI Governance Operations

Most firms that have deployed AI have also produced AI governance policies. Most of those policies describe how AI should be used, who has oversight responsibility, and what ethical principles apply. Almost none of them constitute a functioning governance operating model.

Documentation is not control. A policy that says a human will review AI outputs is not equivalent to an audit trail that demonstrates a specific human reviewed a specific output, made a specific determination, and that determination is timestamped and preserved. The gap between having principles and having governance operations is the gap between exam readiness and exam exposure.

Fragmented Systems, Fragmented Evidence

The average investment adviser firm has added technology incrementally across a decade: a portfolio accounting system, a CRM, a compliance tracking tool, and a risk platform. None of these systems were designed for interoperability. Data lives in silos. Reconciliation requires manual intervention. Reports require human assembly before they can be trusted.

Every manual step in a data workflow is a gap in the evidence chain. Examiners assessing the consistency between what a firm's systems contain and what its Form ADV, Form CRS, and marketing materials represent are looking at exactly those gaps. Discrepancies that trace to fragmented data infrastructure are increasingly treated not as administrative errors but as governance failures.

WHAT THE STANDARD ACTUALLY REQUIRES

Examination Readiness in the Evidence Economy

The SEC's updated Examination Manual, revised for the first time since 2017, reflects an enforcement posture that has become a production system. The Wells process now operates on a four-week submission timeline followed by a four-week meeting with senior leadership. Settlement and waiver considerations run simultaneously. The operational implication is that firms facing examination cannot afford to spend the early weeks assembling their evidence. They need it to already exist.

Examination readiness in 2026 has four core requirements:

  • A current, complete data inventory that maps what data the firm holds, where it lives, who owns it, retention requirements, and which third parties have access.
  • Continuous reconciliation across systems, not batch review. Controls built for after-the-fact documentation are stretched past their design limits in an environment where settlement cycles are compressed and AI generates decisions in real time.
  • An AI governance operating model, not a policy document. This means structured logging, timestamped human review records, replay capability, and documented change control for every model version that touches a workflow touching clients or investment decisions.
  • Active vendor oversight with contractual specificity. Not attestations: documented reviews, tested notification procedures, and contractual obligations that mirror the firm's regulatory commitments, timeline by timeline.
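The first of these requirements, a data inventory, is ultimately a queryable structure rather than a spreadsheet exercise. As a minimal sketch, with every field and function name hypothetical, a per-dataset record can turn the breach-scoping question into a lookup rather than a project:

```python
from dataclasses import dataclass, field

@dataclass
class DataAsset:
    """One row of a firm-wide data inventory."""
    name: str              # e.g. "client_suitability_profiles"
    system: str            # where the data lives
    owner: str             # accountable individual or team
    retention_years: int   # regulatory retention requirement
    contains_pii: bool
    third_parties: list = field(default_factory=list)  # vendors with access

def breach_scope(inventory, vendor):
    """Answer the time-boxed breach question: which PII assets could a
    compromised vendor have touched, and who owns each one?"""
    return [(a.name, a.owner) for a in inventory
            if a.contains_pii and vendor in a.third_parties]

# Illustrative inventory entries
inventory = [
    DataAsset("client_suitability_profiles", "CRM", "ops-lead",
              6, True, ["VendorX"]),
    DataAsset("trade_blotter", "OMS", "trading-lead",
              6, False, ["VendorY"]),
]
```

The point of the structure is not the code; it is that "what did this vendor have access to?" becomes answerable in minutes instead of weeks.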

If the SEC showed up tomorrow, could you prove, with evidence rather than explanation, that your controls worked end to end? Or would you need the time you assumed you had?

THE OPERATING MODEL RESOLUTION

Data Infrastructure as Strategic Function

The firms that are pulling ahead on this are not the ones with the newest technology stack. They are the ones that made a strategic decision to treat data infrastructure as a core competency rather than a cost center, and built operating models where evidence production is continuous, not reactive.

That model has specific characteristics. Reconciliations that run automatically and surface exceptions before they reach any output seen by clients. Data governance that is embedded in daily operational workflows, not performed as a periodic compliance exercise. Audit artifacts that accumulate in structured form so that any examiner question is answered by pulling a record, not convening a team to assemble one.
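A continuous reconciliation of the kind described above reduces to a small comparison routine whose output doubles as an audit artifact. This is an illustrative sketch, not any vendor's API; the function and field names are assumptions:

```python
def reconcile(source_a, source_b, tolerance=0.0):
    """Compare position records (key -> quantity) from two systems.

    Returns the exceptions: keys where the systems disagree beyond
    tolerance, plus keys present in only one system. Run on a
    schedule, the output itself is evidence: an empty list, stored
    with a timestamp, is proof the systems agreed at that moment.
    """
    exceptions = []
    for key in sorted(set(source_a) | set(source_b)):
        a, b = source_a.get(key), source_b.get(key)
        if a is None or b is None:
            exceptions.append({"key": key, "issue": "missing", "a": a, "b": b})
        elif abs(a - b) > tolerance:
            exceptions.append({"key": key, "issue": "break", "a": a, "b": b})
    return exceptions
```

Note the inversion of the usual workflow: staff review the exception list the routine surfaces rather than hunting for breaks by hand, which is the "reviewing exceptions rather than finding them" shift described below.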

Managed services are rising in this environment not because firms are ceding operational ownership, but because the most effective approach to continuous evidence production combines agentic execution with deep domain expertise and governed data stewardship. The goal is not fewer people. It is different people doing different work: orchestrating systems rather than running them, reviewing exceptions rather than finding them, exercising judgment in places where judgment is genuinely required.

STP Investment Services works inside the operational workflows of registered investment advisers, private fund managers, and institutional asset managers. Our managed services, compliance support, and agentic operating model are built around one outcome: data that is clean, reconciled, and ready to evidence before anyone asks for it. Because in 2026, the firms that win examinations and institutional due diligence will be the ones that could have answered the question before it was asked.

THE QUESTION WORTH ASKING

If a regulator asked you today to produce auditable evidence of control effectiveness across a key operational process, end to end — how long would it take?

The answer to that question is the most honest assessment of your firm's data infrastructure you will get without having an examiner in the room.

 

REGULATORY & DATA REFERENCE

Reg S-P (Amended): 72-hour breach notification. Written incident response programs. Active vendor oversight. Effective 2024.

EU AI Act Art. 12: Logging of all high-risk AI system operations. Inputs, outputs, and model version documentation.

EU AI Act Art. 14: Human oversight must be functional, not nominal. Override authority required.

EU AI Act Art. 26: Deployer bears full accountability. Vendor terms do not transfer regulatory liability. Enforceable August 2, 2026.

SEC Exam Manual (2/24/26): First update since 2017. Wells process: 4-week submission, 4-week leadership meeting. Enforcement is now a production system.

Marketing Rule: Performance claims require substantiation. Documentation of AI assisted content is an emerging examination surface.

 

SOURCES

SimCorp, 2026 InvestOps Report — Statistics cited: 70% of firms with AI in front-office production (up from 10% in 2025); 65% of firms expanded operational teams for T+1 rather than investing in technology; 63% of investment operations firms lack a unified front-to-back data layer.

Datos Insights, 2026 Wealth Management Trends — Statistics cited: 87% of wealth management firms deploying AI; 4.9% compensation cost growth versus 4.1% revenue per FTE growth.

EU AI Act (Regulation 2024/1689) Articles 12, 14, and 26. Fine thresholds and enforcement date. eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:32024R1689

SEC Amended Regulation S-P (2024) 72-hour breach notification requirement; written incident response program; active vendor oversight obligations. sec.gov/rules-regulations/2024/05/ia-6639

SEC Division of Examinations, Updated Examination Manual (February 24, 2026) First update since 2017. Wells process timeline, simultaneous settlement, and waiver consideration framework.