Meeting planners can track more information than ever about attendees. That data can help you understand where your event succeeded and where it may have fallen short, but the information stream often feels like a flood. A smart advance strategy that defines what you need to know can keep you from going under.
Associations have more meetings data at their disposal than ever, which is both a blessing and a curse. App downloads and usage, web visits, beacon metrics, attendee demographics, social media chatter, and more are now at meeting planners’ fingertips. But planners might be wishing for more fingers: Today, the challenge is less about acquiring information on an attendee’s experience than knowing what to do with it.
To that end, associations have begun folding a data-gathering strategy into the other logistical matters that go into meeting planning. That requires making decisions early about what elements of the meeting will be measured and how that information will be used to improve the next one.
“You want to be able to make sure that you have the right technologies in place, so that the by-product—that data that’s being created—allows you to do something informative afterwards,” says Joe Colangelo, CEO of Bear Analytics.
In other words, associations not only are collecting more data but are learning to conduct deeper, more enlightening dives into it. And although they’re deploying some shiny new tech tools, the goal remains old-school: Focus on the experiences that make attendees feel the event was worth their time, and look for ways to expand and reinforce those experiences.
A Closer Look
Data isn’t always numbers—sometimes it’s just good information that comes from observation. So earlier this year, the Society for Critical Care Medicine tried an experiment.
The society’s meetings team felt they hadn’t been getting a clear picture of the attendee experience at prior conferences, says former SCCM Business Analyst Karen Boman. Some session rooms were oversized and felt empty, while others had overflow crowds. Speakers were hard to track. Staff learned late about problems at the venue.
So at this year’s conference in San Antonio, SCCM installed cameras throughout the convention center and created a “command center” where staffers—at least two at any time—continuously monitored the activity. “We were able to see what was happening in each of the rooms,” says Boman. “When a session room would start to get full, we would be able to determine that right away, and then we could plan for an overflow.” The command center also allowed staff to confirm that speakers had checked into a ready room and had presentations ready.
Now SCCM is using the information it gathered about session attendance to help plan for upcoming meetings. For instance, it tracked the best-attended sessions by topic and recognized that certain hot topics for attendees, such as sepsis, will require more sessions or larger rooms.
Of course, scrutiny of attendee engagement can be extremely granular, from tracking attendees’ preferred activities via a conference app to following their every last footfall via beacon technology. A smart association can use that information to learn where attendees are gravitating and to get insight into the interests of various attendee segments.
“What do attendees from small organizations care about? What do medium-sized ones care about? What do CEOs care about? You can actually see what topics are trending up and trending down when you look at that,” says Julie Sciullo, CEO at Association Analytics.
Because there are so many data sources available, association staff should be discussing before the event what they intend to measure. An event that’s built around a tradeshow has different data needs than one built around continuing-education credits.
“If it’s a medical conference, you’re going to want to prioritize understanding what sessions people are going to and [determine] if their profiles are matching up to the event’s education profiles,” says Colangelo. For instance, he recommends tagging sessions by category, so that attendance data can be studied over a period of years to measure the success of particular education tracks.
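Colangelo’s tagging approach can be sketched in a few lines of analysis code. The example below is a minimal, hypothetical illustration (the session categories and attendance counts are invented, not drawn from any association’s data) of how tagged attendance records might be rolled up by education track across years to spot rising and falling topics:

```python
from collections import defaultdict

# Hypothetical session records: (year, category tag, attendance count).
# In practice these would come from registration or badge-scan data.
sessions = [
    (2022, "sepsis", 310), (2022, "ventilation", 180), (2022, "sepsis", 250),
    (2023, "sepsis", 420), (2023, "ventilation", 160), (2023, "pharmacology", 140),
]

# Total attendance per (year, category).
totals = defaultdict(int)
for year, category, attendance in sessions:
    totals[(year, category)] += attendance

# Year-over-year change per category shows which tracks are growing.
for category in sorted({c for _, c in totals}):
    by_year = {y: n for (y, c), n in totals.items() if c == category}
    years = sorted(by_year)
    trend = by_year[years[-1]] - by_year[years[0]] if len(years) > 1 else 0
    print(f"{category}: {by_year} (change: {trend:+d})")
```

The point is less the code than the discipline it implies: categories have to be applied consistently at planning time for the multi-year comparison to mean anything.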
Learn From Registration
Sciullo says that while there’s plenty of data to be gathered onsite, associations shouldn’t neglect the information collected during the registration process. That information says something about where attendees’ enthusiasm lies, or it may identify a different issue, such as a poor user experience on the event website that becomes an obstacle to sign-ups.
“Whenever you’re doing marketing campaigns, you want to see in real time how that’s affecting your registration,” she says. “One client looked at where people were abandoning the process, and then broke it down and looked at each piece of that. They were able to take their abandon rate from around 3,900 to around 1,000 this year.”
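A funnel analysis like the one Sciullo describes can be approximated from simple event logs. Here is a minimal sketch, assuming a record of how far each visitor got through the sign-up process; the step names and sample data are invented for illustration:

```python
from collections import Counter

# Hypothetical registration funnel, in order.
STEPS = ["start", "attendee_info", "sessions", "payment", "confirmed"]

# Furthest step each visitor reached (invented sample data).
furthest = ["start", "attendee_info", "attendee_info", "payment",
            "confirmed", "confirmed", "sessions", "start"]

reached = Counter(furthest)

# Count how many visitors got at least as far as each step,
# then report the drop-off between consecutive steps.
at_least = []
remaining = len(furthest)
for step in STEPS:
    at_least.append((step, remaining))
    remaining -= reached[step]

for (step, n), (nxt, m) in zip(at_least, at_least[1:]):
    print(f"{step} -> {nxt}: {n - m} abandoned ({n} -> {m})")
```

Breaking abandonment down step by step, as the client above did, is what turns a single alarming number into a fixable list of friction points.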
As director of marketing and communication services for the Events Industry Council, Amanda Darvill of SmithBucklin has grown comfortable with studying all manner of quantitative data: open rates on conference marketing emails, app usage, gamification participation, and social media activity down to the last like.
“We’re looking at a lot of data that we’re gathering on a monthly basis and correlating that with registrations,” she says. “When did the last email go out? What was the open rate? Did we have a discount that we included in that, and what was the spike in registration?”
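The spike-spotting Darvill describes amounts to joining email sends against daily registration counts. A hypothetical sketch follows (the dates, counts, and campaign names are invented; real inputs would come from the email platform and registration system), comparing registrations just after each send with the overall daily baseline:

```python
from datetime import date, timedelta

# Invented daily registration counts keyed by date.
registrations = {date(2024, 3, 1) + timedelta(days=i): n
                 for i, n in enumerate([12, 14, 11, 40, 35, 13, 12, 45, 30, 14])}

# Invented email sends: (send date, campaign name, open rate).
emails = [(date(2024, 3, 4), "early-bird discount", 0.42),
          (date(2024, 3, 8), "last-chance reminder", 0.38)]

baseline = sum(registrations.values()) / len(registrations)

# Compare registrations in the two days starting at each send with baseline.
for send_date, campaign, open_rate in emails:
    window = [registrations.get(send_date + timedelta(days=d), 0) for d in (0, 1)]
    lift = sum(window) / 2 - baseline
    print(f"{campaign}: open rate {open_rate:.0%}, avg lift {lift:+.1f} regs/day")
```

Even this crude comparison makes Darvill’s questions answerable: which sends moved registrations, and whether a discount in the message coincided with a spike.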
But Darvill cautions against too much reliance on data alone. Beacon data, for instance, might suggest that particular parts of a conference floor were a draw for attendees—or it might have just been one of the few places at the conference center with decent cellphone reception.
To avoid drawing wrong conclusions from data, Darvill interacts directly with attendees and potential attendees both before and after an event. Promotional calls before a Council event double as surveys about the kind of experiences people are looking for; focus groups afterward can yield input that will help the association establish what quantitative data would be most useful to gather at the next conference.
In both cases, she says, knowing your goals for your event helps clarify the questions that should be asked.
“Success of an event is often [defined by] the number of people attending or whether you met a certain budget number,” Darvill says. “But it’s also asking, what does success look like from an analytics standpoint? In that case, having a collaborative effort to make sure you’re gathering the right data gets you the answers you need to tell the story you want to tell to volunteers or the board of directors about the event.”
SCCM’s Boman has two thoughts on how the command center can improve next time around. First, the more cameras, the better. Second, the information it gathers on attendees’ and speakers’ activities should be integrated more seamlessly into the society’s association management system.
For example: “We work with a third-party vendor for our faculty management, so we want to make sure the data [the vendor collects] syncs up with our AMS system so that we’re able to pull the information correctly and make sure that we’re reporting on our speakers correctly.”
Those issues aside, she’s sure that the close attention the command center provided onsite is essential to the conference’s future. “I don’t think we could go back,” Boman says. “It would be like going into a meeting blind.”