As I slowly transition out of Placemeter / Netgear, I’m exploring a variety of other ideas and working again with some of my favorite friends.
Starting early March, I’ll spend a good portion of my week helping Nathan and the rest of the TradeIt team on a variety of subjects. As part of my re-education on the fintech space (I dabbled in it a bit during the Gilt City days), I’m trying to attend as many events in the space as possible.
Tech:NYC and TradeIt hosted a Chatham House Rule session yesterday for leaders in the space, from banks to aggregators. While I can’t attribute comments or name names, my observations on this hot space are included below.
Back in 1997, as Nate pointed out, this debate first raged when Microsoft and Intuit were creating a standard, “OFX”. Then, in mid-2016, Jamie Dimon positioned himself as the consumer advocate for banking data: a very unlikely position that mostly betrays the growing frustration between:
- aggregators (think Yodlee),
- application developers that build products on top of these aggregators (think Mint or Current),
- and the data originators, mostly the large commercial banks.
The consumer is, as usual, taken hostage in a debate that centers on data ownership, value chain capture and innovation. Originally, the rise of the aggregators was prompted by the banks’ inability or unwillingness to build the infrastructure that would allow other companies to plug into their pipes. You could compare it to the early days of Twitter, when a lot of the firehose and its data were handled by Gnip and DataSift.
Banks are getting smarter and understand that, by not owning that part of the chain, they are leaving a big part of the value on the table (and could ultimately become entirely disintermediated).
The CFPB (which is at risk of being dismantled pretty soon given the Trump administration’s current stance) issued an RFI seeking input about data aggregation. The RFI is here, and on page 5 you can read the list of 17 questions they are trying to get answered.
Having recently spent a good amount of my time on civic tech, gov tech and open data, this reminds me of the tension that still exists between cities, citizens and a tech world that is trying to build an application layer on top of the data produced by administrations. The pendulum swung wildly in favor of more transparency during the Obama years but, as the tweet below illustrates, it can swing back to the other extreme pretty fast.
— IQuantNY (@IQuantNY) February 15, 2017
My initial reaction is to think that, as a consumer, and as the one who initially “created” the data collected by my bank (bought X, wired money to Y, withdrew Z $$$), I should be empowered to use my data in any way I see fit, as long as I’m the one approving that use and the data ingestion is secure. But this raises many more issues that need to be fleshed out:
- what is informed consent from a consumer perspective? (i.e. how often do you read in detail what info you do and do not give to an aggregator or a developer),
- what data are you granting, and shouldn’t we have different treatment and different consent layers for different types of data (PII vs. transaction-level vs. aggregated, etc.),
- what is the role of the regulator here, and how large a role do we want it to play vs. agreeing on a set of guidelines at the industry level (though defining “industry” is itself tricky),
- and the list could go on and on…
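To make the consent-layers idea above a bit more concrete, here is a minimal sketch of how per-category consent could be modeled. All names here (`DataScope`, `ConsentGrant`, the scope tiers) are hypothetical illustrations, not any real aggregator’s or bank’s API:

```python
from dataclasses import dataclass
from enum import Enum


class DataScope(Enum):
    # Hypothetical consent tiers, from most to least sensitive
    PII = "pii"                    # name, address, account numbers
    TRANSACTIONS = "transactions"  # individual line items
    AGGREGATED = "aggregated"      # balances, monthly totals


@dataclass
class ConsentGrant:
    consumer_id: str
    developer: str
    scopes: set  # the DataScope tiers this consumer explicitly approved

    def allows(self, scope: DataScope) -> bool:
        # A developer only sees a data tier the consumer opted into
        return scope in self.scopes


# A consumer might grant a budgeting app aggregated data only:
grant = ConsentGrant("user-123", "budget-app", {DataScope.AGGREGATED})
assert grant.allows(DataScope.AGGREGATED)
assert not grant.allows(DataScope.PII)
```

The point of the sketch is simply that consent need not be all-or-nothing: each tier can carry its own approval, expiry and revocation rules.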
Another lingering question: as part of the overhaul of Dodd-Frank, what will end up happening to Section 1033, the one protecting consumers’ access to their data? While it’s most likely not going to be dismantled, it could 1. be severely limited to the benefit of the large banking players, and 2. start shifting some of the data responsibility away from those same banks. To be followed.
But I can’t help smiling when Dimon comes to my defense as a consumer while, on the other side, pushing like crazy to dismantle Dodd-Frank and any regulation limiting his company…