As the ability to personalize member experiences matures, expect many conversations in 2018 about how organizations use member data, driven in part by the European Union's forthcoming GDPR data management requirements. An upfront strategy will serve your organization well.
Here’s a conundrum for the new year.
A recent study by Community Brands offers evidence that personalization is what many association members now expect from their organizations. At the same time, the public, which has been heavily targeted by personalization in online marketing over the past decade, seems to be growing less comfortable with what the technology can actually do.
Example: Last month, Netflix tweeted a data point about its holiday movie A Christmas Prince. The tweet was played for laughs, but it sparked a conversation about just how much data the company has on its millions of users. Even though the data point exposed no personal information, the fact that it was accessible at all made plenty of users uncomfortable.
Another recent case involved the age-based targeting of job ads on Facebook, which ProPublica reported appeared to violate federal law; its story coincided with the filing of a class-action lawsuit. (Facebook disputes the nonprofit news outlet's claims, comparing the recruitment tactic to advertising a job in a niche magazine.)
Are You Inferring Too Much?
These two cases, in different ways, highlight a broader issue: Personalization is great for boosting engagement—and there’s a lot to be learned in your data—but it often raises ethical concerns if done carelessly.
In a recent Slate piece, Rena Coen, a policy analyst at Berkeley Law, noted that many Americans are uncomfortable with inferences made about them based on data—particularly inferences about personal information like race or household income—and that such inferences, in general, are shaky ground.
Coen, who coauthored a paper [PDF] on algorithmic personalization with researchers at both Berkeley and the Center for Democracy & Technology, says roughly two-thirds of respondents thought inferring race or income level was an unacceptable or somewhat unacceptable practice.
“Our study suggests that the industry needs new standards to better inform and empower people to protect their personal information,” she wrote in her Slate piece. “This requires not only allowing users control over what data are collected about them but also what is inferred about them, how those inferences are used, and by whom.”
Certainly, a wide base of data has its benefits, and it underpins plenty of strong personalization strategies at associations. But with these ethical debates playing out in the open, your organization should be asking the same questions of itself.
Ethics, Data, and GDPR
What's driving this conversation? In part, entities much larger than associations, such as Facebook and YouTube, are facing questions about their ad targeting and about which content appears next to which ads. Those questions have helped bring a wider ethics discussion into the open, one that is only now happening in earnest.
But the issue is especially urgent now because of the looming implementation of GDPR, the European Union’s General Data Protection Regulation, which takes effect in May.
The rule, which will restrict how organizations collect and use personal information about citizens of EU nations, applies regardless of whether an organization has a base of operations in Europe. Previously, EU data privacy rules affected only companies with a physical base in an EU nation.
GDPR gives individuals the right to access the personal data an organization holds on them, the right to be forgotten (that is, to have their data erased), and the right to receive their data in a format that can easily be transmitted to another provider. The regulation also mandates that systems be designed with privacy in mind and, in many cases, requires organizations to designate a data protection officer to oversee compliance and internal record keeping.
And while GDPR will require some potentially painful changes in organizations’ data management practices—a big topic at ASAE’s 2017 Technology Conference & Expo last month and the subject of a separate ASAE program in February—it’s also going to help set the rules of a road that’s full of potholes.
As technology has opened new directions for how data can be mashed together in aggregate and pulled apart with surgical precision, our rule-making simply hasn’t kept up. GDPR will not be the only way such data will be regulated in the future, but it is the one getting the most buzz now, and associations need to be listening.
All of this isn't to say that you can't use data in creative and thoughtful ways. The Association of Chartered Certified Accountants, for example, celebrated a membership milestone by putting the names of all 200,000 of its members on a dedicated wall and encouraging them to find their own. That's data used thoughtfully.
But with the power granted by the information your organization has, you have to be careful not to overstep your bounds. The trust your association hopes to maintain depends on it.