By Raghu Madabushi, Director of Pathfinding & Incubation at National Grid Partners

At National Grid Partners, we invest in Decarbonization, Decentralization, and Digitization of the broader energy infrastructure. We back founder-centric, tech-differentiated startups that have the potential to be category-creators.

I was introduced to Ameesh Divatia, the CEO of Baffle, Inc., by Engineering Capital’s Ashmeet Sidana. When Ashmeet described the product principles, my first reaction was: Why didn’t somebody think about this earlier? My second reaction was: Is the engineering underlying the product even possible to pull off?

As Ameesh described the underlying concept of the Baffle solution, I realized it was elegant and, more importantly, checked off several enterprise requirements. Think banks and financial institutions, industrial conglomerates with vast digital-asset IoT infrastructure, and even highly regulated critical infrastructure like energy and utilities.

Vast troves of extremely valuable data (especially customer and business-sensitive data) reside in databases, and a significant portion of that data resides on-prem or in private clouds. Given the push to adopt public cloud infrastructure, and with enterprises reinventing their data stacks to include a data lake, a data warehouse, or a combination of the two (the “lakehouse”), cloud migration can become a tricky, if not scary, proposition from a security point of view.

Increasingly, an organization’s data infrastructure is not just likely to be multi-cloud and hybrid (spanning on-prem and cloud), but may also involve microservices, serverless functions, containers, and API gateways. Securing sensitive data in such a diverse and distributed environment is a tremendous challenge.

Data breaches (and what happens after), encryption, and cloud migration

It’s no wonder 451 Research calls out three top challenges: data protection and security; governance and compliance; and migrating workloads to new IT environments.

It might seem that a simple solution would be to encrypt the data, both at rest and in transit. But data also has a “supply chain” and typically needs to be accessed by various entities with varying privileges, so the risk of data leakage is higher in such “multi-party” environments. The changing character of the “work” environment, with remote and distributed access and no perimeter as such to defend, makes responding to a data breach a nightmare.

Other industry research indicates that the average data breach costs $3.86M and takes roughly 280 days to identify and contain. As such, modern cybersecurity principles have evolved toward a “zero-trust” regime that assumes most firms will experience a data breach. The question is: what do you do next?

What about analytics?

Another issue is that the overhead of encryption, decryption, and key management impedes adoption of new security solutions. Even if certain use cases tolerate the need to decrypt data (predominantly to perform analytics), decryption makes that data extremely vulnerable to attacks. It’s like leaving the door to a cash-filled vault open while you replace the lock.

This long-standing issue can likely be addressed by encryption-in-use techniques (say, homomorphic encryption) that enable operations and analytics to be performed on encrypted data. The problem is that these techniques have been terribly non-performant: queries become slow and/or require a full data re-architecture.
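To make “encryption-in-use” concrete, here is a toy sketch, purely illustrative and unrelated to Baffle’s technology, using the open-source phe (python-paillier) library, which is additively homomorphic: individual values stay encrypted, yet their sum can still be computed and then decrypted only by the key holder.

```python
# Toy illustration of encryption-in-use with the additively homomorphic
# Paillier scheme, via the open-source `phe` (python-paillier) library.
# This is NOT Baffle's technology; it only shows the general idea of
# computing on ciphertexts without decrypting individual records.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair()

# Encrypt individual account balances (the "sensitive" fields).
balances = [1200, 530, 8750]
encrypted = [public_key.encrypt(b) for b in balances]

# An analytics service can add the ciphertexts without ever seeing
# the underlying values...
encrypted_total = sum(encrypted[1:], encrypted[0])

# ...and only the key holder can decrypt the aggregate result.
print(private_key.decrypt(encrypted_total))  # 10480
```

Even in this toy, the ciphertext arithmetic is orders of magnitude slower than native integer math, which is exactly the performance problem described above.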

Application-level encryption also requires modifying the application source code, which is simply not a viable proposition. And, while we’re at it, let’s not forget the privacy angle: GDPR, CCPA, and other regulations can carry material consequences if customer data is breached.

Does all this mean one has to choose between encryption and ease of use; between encryption and analytics; between encryption and sharing; or between encryption and performance?

No, thanks to Baffle.

Choice without compromise

Baffle provides advanced encryption to make cloud databases breach-resistant without sacrificing performance, sharing, or ease of use.

Ameesh and his team do this by de-identifying sensitive data in the cloud to mitigate breach risk while enabling privacy-preserving analytics:

  • moving structured data to the cloud faster and in an end-to-end-secure fashion;
  • encrypting and decrypting data at the field or record level (see the sketch after this list);
  • requiring no code changes to the application;
  • enabling secure computation among multiple parties on encrypted data without performance degradation.
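To illustrate what field-level encryption means in practice, here is a deliberately simplified sketch. It is not Baffle’s implementation (Baffle does this transparently, with no application code changes), and the table, column, and function names are hypothetical; the point is that sensitive fields are encrypted before they reach the database, so the database only ever stores ciphertext.

```python
# Minimal, hypothetical sketch of field-level encryption in front of a
# database. NOT Baffle's implementation; it only illustrates what
# "encrypting data at the field level" means. Uses the `cryptography`
# package (Fernet, AES-based) and an in-memory SQLite table.
import sqlite3
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, held in the customer's key manager
cipher = Fernet(key)

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customers (id INTEGER, ssn TEXT)")

def insert_customer(customer_id: int, ssn: str) -> None:
    """Encrypt the sensitive field before it ever reaches the database."""
    protected = cipher.encrypt(ssn.encode()).decode()
    db.execute("INSERT INTO customers VALUES (?, ?)", (customer_id, protected))

def get_customer_ssn(customer_id: int) -> str:
    """Decrypt on the way back out, only for authorized callers."""
    (protected,) = db.execute(
        "SELECT ssn FROM customers WHERE id = ?", (customer_id,)
    ).fetchone()
    return cipher.decrypt(protected.encode()).decode()

insert_customer(1, "123-45-6789")
print(get_customer_ssn(1))  # the database itself only ever sees ciphertext
```

A real deployment would put this logic in a proxy or data plane so that neither the application nor the database schema needs to change; the sketch simply shows that plaintext never lands in the protected column.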

Baffle works with established cloud and data lake providers, key management software providers, and various business intelligence (BI) software providers. At the same time, it’s applicable to almost all databases.

Using a patented Secure Multiparty Computation (SMPC) approach instead of painfully slow homomorphic technology differentiates Baffle from the competition. Effectively, this means no party can view another’s data, yet all parties can run aggregate lookups and other operations across organizations.
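To give a flavor of the SMPC idea, without claiming anything about Baffle’s patented protocol, here is a textbook additive secret-sharing toy in which three hypothetical organizations compute a joint total while no party ever sees another’s input.

```python
# Toy additive secret sharing, the textbook building block behind many
# SMPC protocols. NOT Baffle's patented protocol; it only shows how
# parties can jointly compute a sum while no party sees another's value.
import secrets

MODULUS = 2**61 - 1  # arithmetic is done modulo a large prime

def share(value: int, num_parties: int) -> list[int]:
    """Split `value` into random shares that sum to it modulo MODULUS."""
    shares = [secrets.randbelow(MODULUS) for _ in range(num_parties - 1)]
    shares.append((value - sum(shares)) % MODULUS)
    return shares

# Three organizations each hold a private revenue figure.
private_inputs = {"org_a": 120, "org_b": 340, "org_c": 95}

# Each organization splits its value and sends one share to every party.
all_shares = {org: share(v, 3) for org, v in private_inputs.items()}

# Each party locally sums the shares it received (one per organization);
# an individual share reveals nothing about any organization's input.
partial_sums = [
    sum(all_shares[org][i] for org in private_inputs) % MODULUS
    for i in range(3)
]

# Combining the partial sums reveals only the aggregate, never the inputs.
print(sum(partial_sums) % MODULUS)  # 555
```

Production SMPC protocols add far more machinery (multiplication, comparisons, protection against malicious parties), but the privacy property is the same: shares on their own reveal nothing.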

In such a framework, customers can “hold their own keys” instead of handing them over to the application or infrastructure provider. This opens up Baffle for use cases not just around cloud migration, but also around multi-cloud Big Data analytics and protected in-cloud analytics. For example, TPC-C benchmarks of Baffle’s technology show only minor performance degradation vs. plaintext analytics; that is multiple orders of magnitude better than enclave-based methods, let alone homomorphic techniques. Check out Baffle’s solutions here.

My investment thesis around Digitization centers on data infrastructure and operational performance. The next generation of enterprise critical infrastructure will need to build on cloud, data analytics, and cybersecurity on the one hand (“within-the-enterprise infrastructure”); and AI/ML, IoT, and emerging DLT (distributed ledger) technologies on the other (“outside-of-the-enterprise infrastructure”).

We’re very excited about what the Baffle team can enable for customers and the value they can drive. The timing is right, and we believe that for critical infrastructure that is digitizing quickly, Baffle will be a catalyst for further acceleration.

Raghu Madabushi is a Director at National Grid Partners investing in early-stage companies in the broad enterprise software vertical. He has 20+ years of experience across technology, capital markets, and IP/innovation. He previously invested in deep tech and industrial infrastructure at SRI Ventures and GE Ventures; managed a large portfolio of open-source technology projects at the Linux Foundation; and headed early-stage startup investing at Intellectual Ventures. Raghu has also held buy-side and sell-side roles at Wall Street firms and brings extensive experience in hardware and software design (Texas Instruments, Intel, Cadence Design Systems). He received an MBA in Finance and Investments from Southern Methodist University and an MS in Computer Engineering from Iowa State University, and he currently serves as a Kauffman Fellow.