Mike Petroff and Aaron Baker have teamed up to create a custom editorial analytics dashboard called “Scoop,” which they use to serve the content teams at Harvard University and the online version of The Harvard Gazette. Their goal? To give power back to the content creators, using data to answer their questions and help make informed decisions about content. Instead of just number crunching, they’re able to use data to tell a cohesive story about content performance and get to the questions that really matter, like “Is this content really working?” and “What do our users actually need?”
Mike Petroff is a senior product manager at Harvard Business Publishing. He manages the ongoing development of the Higher Education website with a focus on acquisition and engagement.
In his previous role as Director of Content Strategy at Harvard University, Mike led efforts to bring Harvard’s digital content to the world in innovative and effective ways. His team produced, distributed, and measured multi-channel content for flagship properties including Harvard.edu, the Harvard Gazette news website, email newsletters, and evolving social media and mobile platforms.
Outside of Harvard, Mike regularly presents on content strategy, social media, and analytics at conferences including Confab, SXSW Interactive, ContentEd London, and CASE.
Kristina: Hello again. Welcome to The Content Strategy Podcast. I’m your host, Kristina Halvorson. This podcast is brought to you by ContentStrategy.com, and Brain Traffic, a content strategy consultancy. Find out more about Brain Traffic at BrainTraffic.com.
---
Kristina: Hey everybody, welcome back. Thank you so much for joining me once again on The Content Strategy Podcast. Today, I have two ... a twofer, two people here on the podcast to speak to, and I would like to introduce you virtually to Mike Petroff and Aaron Baker. I’m going to read a little bit about them.
Mike is a senior product manager at Harvard Business Publishing, where he manages the ongoing development of the higher education website used by a global audience of business students and educators. He defines the roadmap to advance customer acquisition, search and discovery, and user engagement. No pressure, Mike.
Aaron is the associate director of content strategy for Harvard University Public Affairs and Communications, where he is dedicated to measuring the impact of digital communications. He also serves as a data evangelist, helping writers, editors, and content distributors get a sense of how content engagement metrics can inform their work.
Mike and Aaron, welcome.
Mike: Thank you so much, happy to be here.
Aaron: Hello.
Kristina: Mike and Aaron, you guys came and spoke at Confab this year, in May. You were two of our most highly rated speakers. Congratulations.
Mike: Woo hoo.
Aaron: Yeah, thank you.
Kristina: And it’s not just because you’re awesome, but it is also because your topic was amazing. I was super excited to have you guys come and speak because you talked about the development of what you called a dynamic editorial analytics dashboard and data warehouse called “Scoop.”
Tell me a little bit about what this is. First of all, you’re working ... in fact, you know what? I’m going to back up for just a second because I really want to actually hear about how both of you came to your current roles in terms of working with and understanding the role of content within marketing and the user experience. Mike, do you want to start? Tell me how you got here.
Mike: Yeah, sure. I’ve been at Harvard for about seven years, but I’ve been in higher education for well over a decade now. My background is actually in documentary film production, imagine that. That’s what I went to college for. I loved creating non-fiction content, if you think of it that way. That kind of led me into the world of higher education, doing a lot of storytelling, trying to figure out how to best lead people to where they actually wanted to find things on our websites, through marketing channels and other means. That’s kind of my path into content strategy, sort of that merger between storytelling and emotion, and also giving people what they want when they want it.
Kristina: You know, I am continually amazed, and I say this almost every single episode. When people are talking about their journey to content strategy, they kind of say what they majored in or where they started out, and they’re always just like, “If you can believe that, or if you can imagine that.” And I’m just like, “Welcome to the world of content strategy, where we’re all kind of still figuring out this job that did not exist when we were in college.”
Mike: All misfit toys, yep.
Kristina: Exactly. That’s awesome. Aaron, tell me about you.
Aaron: Also a higher ed veteran, I guess. I’ve been in higher ed for 16 years now, and Harvard for the last four. Primarily on websites and website redesigns. It was a website redesign we were working on in 2005; we had just sort of formed this motley crew of a web team from different silos across the university. At that time, I think we were all reading your book, actually. We were like, “We need this role called content strategy.” That was during my time at UALR. Then I moved into analytics, sort of broadly. Then Mike hired me at Harvard. We started working together on an analytics dashboard just for the news office, The Harvard Gazette. That’s how I came to content strategy. I’m on the content strategy team now. We’ve just been working together on refining what metrics matter to content creators.
Kristina: Okay, I’m so excited to dig into this, because that is exactly it: what metrics matter, not only to content creators, of course, but to our organizations’ bottom line, and to understanding whether or not our users are truly being satisfied or engaged or having their needs met by this content.
Tell me about … so this first analytics dashboard. You both sat down and you were like, “Okay, we have a set of challenges or questions or problems. We need to develop some kind of product that will help us address those questions.” Can you tell me a little bit about how you backed into “we need this dashboard”?
Aaron: I can speak to part of this, then I’m going to ask Mike to fill in the gaps. For me it was a matter of working smarter and not harder. As a newbie analyst on this team I was doing a lot of manual reporting, a lot of logging in, copying numbers, pasting them into a spreadsheet, sending out an email. It was mind-numbing. It didn’t give me any time for what would be more helpful analysis; it was just sort of number crunching.
So I started building these scripts to automate some of these workflows. What’s the open rate for today’s email? How many clicks went through to this story? How many page views did this story get? I sort of approached that and said, “Mike, I want to work on this at a level where I can automate it so I can make my day go by faster.” Then Mike, being in the product mindset I think, started to think, “Whoa, if you’re doing all this work, let’s just take it one step further and actually publish it to the people who need to see it.”
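A minimal sketch of the kind of reporting script Aaron describes, assuming the metrics come from the email provider and Google Analytics; `fetch_email_stats` and `fetch_pageviews` are hypothetical stand-ins for those API calls, not the actual scripts used here:

```python
# Sketch of an automated daily report. fetch_email_stats() and
# fetch_pageviews() are hypothetical placeholders for the email service
# provider's and Google Analytics' reporting APIs.
import csv
from datetime import date

def fetch_email_stats(day):
    # Placeholder: call the ESP's reporting API and return
    # (sends, opens, clicks keyed by story URL) for that day's mailing.
    return 0, 0, {}

def fetch_pageviews(url, day):
    # Placeholder: call the Google Analytics reporting API for one URL/day.
    return 0

def daily_report(day=None):
    day = day or date.today()
    sends, opens, clicks_by_url = fetch_email_stats(day)
    rows = [
        {"url": url, "email_clicks": clicks, "pageviews": fetch_pageviews(url, day)}
        for url, clicks in clicks_by_url.items()
    ]
    open_rate = opens / sends if sends else 0.0
    print(f"{day}: open rate {open_rate:.1%}, {len(rows)} stories linked")
    with open(f"report-{day}.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["url", "email_clicks", "pageviews"])
        writer.writeheader()
        writer.writerows(rows)

if __name__ == "__main__":
    daily_report()
```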
Mike: Then I think, from my perspective, I was working very closely on the strategic side with the entire communications office; that included media relations, the Harvard news team, folks running Harvard.edu, and the social teams. Generally, there was this fear that measurement meant that we were going to essentially grade them on performance. There’s this idea of page views being a touchy metric when writers are writing for specific beats. They might worry that if they’re not getting the number of page views that other writers covering science are getting, then they might get negative performance evaluations.
We wanted to come at it with the approach of measuring the content for what it was meant to be and do. I think in working with Aaron this aligned with his idea of trying to be smarter with the data collection, so we could really open ourselves up to ask those questions from a strategic standpoint and, as Aaron mentioned in our presentation, create a true measurement plan. We got away from the marketing of the content and got more into “What are we actually trying to measure once a user is here interacting with our news content from The Gazette?”
Kristina: I’m super interested to talk a little bit about how you were able then to internally tee up this project so that, to your point, it wasn’t like, “We need to do a better job of tracking how well you’re performing and really putting you on the line, writer, content creator, videographer, whomever.”
Talk to me about how you positioned it: “We’re going to open up this opportunity to start gathering data that will help inform not only how the content we’ve been creating is working, but what kinds of content we should be creating to serve user needs.” Tell me how you opened that conversation up, and then how you went about analyzing the current state and what problems needed to be solved. I want you to do it all in thirty seconds. Okay, that’s not true.
Mike: I’ll tee it off and then, Aaron, hand it to you. I think from our point of view, there was just a lack of awareness on the team of what could potentially be measured for the website itself. If they wanted to measure how long people spent on the site, we were unpacking that and saying, “What are you really trying to understand? Because time on site is just a calculation. We can try to measure whatever you actually want to get at.” I think it was important to start there and ask the question one level beyond what they were trying to get at, so you weren’t just delivering an Excel sheet with that metric. You were actually trying to ask those questions up front.
Aaron: I think to follow up with that, those early conversations were a little bit tricky. We lacked a shared vocabulary about metrics and data points. I think there was a suspicion towards data in general which we had to ... It was a challenge, and that really ... It showed itself in really weird ways too. It’s not that they didn’t believe me when I told them a story had X number of page views or whatever. It was more when we started to get into data decision making. That’s where people got really suspicious.
It’s like, “We need to A/B test the subject line of this email. Write two subject lines and the computer, or whatever, the algorithm will pick the one that’s performing better.” They just did not want to do that at all. We had to start back over and say, like Mike said, “What data points will help you make decisions?” We stopped using language like “We want to be data-driven” and started using language that was more “We want to be data-informed.” I think, when you give the power back to the content creators, we weren’t really looking to optimize everything. It was more like, “Hey, we’re going to try to surface this data for you. You tell us what you want to know.”
Kristina: Tell me a little bit about, what do you think that they were afraid of? You said that they were suspicious and afraid. The reason I want to talk a little bit more about the people side of it, and not just “Tell me what you were measuring. Tell me what was included in the dashboard,” which we’ll get to, is that I think any time change is introduced, even if it is “This is going to help you do your job better,” or “This is going to help the content or the product that we’re putting out, as a whole, be stronger and better,” people are so ... I don’t want to use the word precious because that’s demeaning, but you know, they’re really territorial about it. Or they’re emotional about it. They’re just like, “You don’t understand my audience. You don’t understand why I’m creating this content. I need you to just get out of my way so that I can do my work.”
Aaron: I think that part of the answer there is that there was little understanding of what data could do for them. And I think that I had, on the other side, I had some ethos building to do myself. So, I was a new ... you know, my position was pretty new. They sort of refactored it to be this analytics lead. You know, my first three months here they called me the data guy. Now, I think they call me that lovingly, but at first there was a suspicion of, like, “Well, we’re a comms office. What are you doing here? What can you tell us?” And there was also this suspicion of, like, “So what? You’ve given me all this data regarding the things I’ve produced. What am I supposed to do with it?”
And I think that’s an initial reaction based on not seeing it. So, I just, I think all of that was very justified, and in my own ethos building I had to start by saying, “Well look, I have a background in writing too. I have an undergraduate degree in journalism, and a graduate degree in technical writing.” I was like, “I’m a writer too.” So, when I talk about metrics, I ... So, I went down that path with the writing and editorial staff. I think that sort of helped.
Then, when we started to produce sample reports, you know, we started to introduce metrics that we hadn’t collected before like scroll depth, and we fixed time on page to be engaged time on page. I think that’s when we started, little by little through separate conversations, we started to show, “Look, this is what we’re actually talking about. We’re not talking about something that’s going to change who you are, or what you write about. We’re just going to give you data that tells you what you can now expect.”
Mike: And I think it was also about understanding from the team that they assumed data would create some difficult conversations, especially with what we call the democratization of data, that sort of visibility of data everywhere. We were working in an environment where we were sending out reports to sort of curated lists of people, so they could make strategic decisions on what to tackle next. But if we were going to start sharing that out widely to everyone, not only with us but with their managers and with other content creators, we’d have to have these sticky conversations: why did this do this? Or why did this perform this way?
I think it also opened up some questions around the marketing of the content, too, since those teams were aligned. You know, questions like, why wasn’t this featured on the homepage? Why didn’t this go in an email? What did we do with social media on this? So, the content writers were now more interested in marketing than ever before.
So, in some of those early conversations, we wanted to just try to get some visibility into that, and be very mindful of it when we started, even down to the UI design of the dashboard, trying not to be value-driven with these metrics, but just informative. So, it was really trying to understand where those pain points were early in the process, and not just looking at the dashboard and how to use it, but more at the interactions we’d create between staff members as we opened up the data to everyone.
Kristina: You know, I think that that’s fantastic, because a conversation I have a lot with clients is they want to know, “How do we know if we’re measuring the right things?” A lot of times what happens is they’ll show me these reports and it’s just numbers upon numbers upon numbers, versus: this is information around which we’re going to have a conversation. It’s not the number; the number is input to the larger conversation, the decision-making process.
And it’s not necessarily seen as that. It’s seen as just this thing that is ... either it was good or it was bad, and that’s what you’re finding.
Aaron: Totally, I totally agree, that’s what we were facing too.
Kristina: Okay, so tell me about the thing. Tell me about Scoop.
Aaron: Well, what do you want to know?
Kristina: I want to know everything.
Mike: I know.
Kristina: Start at the beginning.
Mike: Can I start with the name, actually? How we came up with the name, because we didn’t get to tell its story. So, very quickly, we were trying to think of a name for this, especially because we wanted it to be memorable. We didn’t want it to be just another dashboard. Aaron was very smart early on when he said everything should just be called this thing, even if it’s not the thing, but every data point, every thing that we share out.
So, Scoop comes from the idea of it being focused on journalism. So, the idea of like getting the scoop, or what’s the scoop? But also, sort of scooping for data and looking for broader insights. But then, I come to find out that the managing editor of The Gazette, her nickname in, I believe, high school was “Scoop.” So, we found this out after we came up with the name and she loved it.
So as soon as I heard that I said, “We’ve got to go with Scoop. That’s the name for this thing.”
Kristina: That is good stuff. Then, of course were you just like, “We named it after you.”
Mike: Every day.
Kristina: Yeah, for sure. Exactly. That’s awesome. Okay. So, you decided then, you’ve got data all over the place. You want to start making it contextual. You want to start using it to make better decisions. You don’t want it to just be like a report back on what’s happening. Talk to me about what happened next.
Aaron: Well, the way we sort of started thinking about it was we wanted to find a way to stitch together data points in order to tell a cohesive story about performance. So, if you write a story, and then that story goes into our daily email—we send an email to almost 300,000 people every weekday. We also might feature that same story or parts of it in our Facebook feed. We have five million followers on Facebook.
So, all of these distribution points, you know, funnel traffic back to the story. So, we wanted to see if there was a way that we could take data from these different systems, like Facebook and our email service provider, and then we’re using Google Analytics on the news site itself. We wanted to stitch those together.
That’s how I started approaching the idea: to just see if I could marry up the tables of data with a unique identifier. In this case, the URL.
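A minimal sketch of that stitching step, assuming each system’s data has already been exported to a CSV keyed by story URL; the file and column names here are illustrative:

```python
# Sketch: join per-story metrics from separate systems on the story URL.
# File and column names are illustrative.
import pandas as pd

ga = pd.read_csv("google_analytics.csv")      # url, pageviews, engaged_time
email = pd.read_csv("email_clicks.csv")       # url, email_clicks
facebook = pd.read_csv("facebook_posts.csv")  # url, reach, reactions

stories = (
    ga.merge(email, on="url", how="left")
      .merge(facebook, on="url", how="left")
      .fillna(0)
)
print(stories.sort_values("pageviews", ascending=False).head(10))
```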
Kristina: Mike, you want to add to that?
Mike: Yeah, sure. So, the other person that’s, I would say, a third, a third, a third into the creation of this is our development partner Lin. She was key in helping both me and Aaron in this task of what is this thing going to be, and how’s it going to perform? We knew we had data sources. We knew we needed tables to put them in, and then we all kind of decided as a group after a whiteboarding session, the three of us, to say, “We all know WordPress pretty well. WordPress has a front end and a backend database. Let’s just start putting stuff into WordPress to see if we can match posts in WordPress with posts as they’re published on The Gazette website, which is also on WordPress.”
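A rough sketch of what matching a published story back to its WordPress record might look like, using WordPress’s standard REST API; the site URL and field choices are illustrative, not the actual Scoop code:

```python
# Sketch: resolve a story URL to its WordPress post to get the canonical ID
# and publish timestamp. Uses the standard WordPress REST API; the endpoint
# URL here is illustrative.
from urllib.parse import urlparse
import requests

POSTS_ENDPOINT = "https://news.harvard.edu/wp-json/wp/v2/posts"  # illustrative

def wordpress_post_for(story_url):
    # WordPress slugs are the last path segment of the story URL.
    slug = urlparse(story_url).path.rstrip("/").split("/")[-1]
    resp = requests.get(POSTS_ENDPOINT, params={"slug": slug}, timeout=10)
    resp.raise_for_status()
    matches = resp.json()
    if not matches:
        return None
    post = matches[0]
    return {"id": post["id"], "published": post["date"], "link": post["link"]}
```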
So, I think we just very quickly wanted to get to some sort of MVP, or minimum viable product, of a dashboard or a report. What we slowly discovered is that we could work and iterate very quickly, and try things in it to show people what data we could get for each story. Whether that was listing out the top stories of the week, and the number of page views and engaged time they had. Or, if it was more granular, sort of the top emails that we had sent over the course of the month, and what the open and click rates were.
There had never been one place where all of this stuff lived, and on top of that, we’d never given people the ability to actually engage with it. I think that was key: we used to deliver sort of static, flat reports. Even if they looked great, that was it. But with this we could start to put it in front of them. I still remember giving the first version we had to the two editors, and just seeing how they interacted with it, and what they wanted next.
Sort of, they would just give us, “Oh, it would be cool if ... Wouldn’t it be great if ...” And then that became essentially our prioritized backlog for the product.
Kristina: See, that’s really interesting to me, because I will say one of the most kind of eye-opening client sessions I’ve ever sat in on (I was asked to sit in) was where this huge, super expensive analytics company would come in every month and give them this giant readout of how their website was performing, how their content marketing was performing.
What I sat and watched for an hour was literally ... it was like, “We have 8,000 numbers and we’re going to put them in a bunch of graphs. We’re going to compare them to industry standards. We’re going to ...” It was literally like a show and tell with different, “Now here’s the same information painted a different way. Now, here’s the same information in a bar graph.” And it made me crazy because I was just like, “None of this is meaningful. This is basically like a thumbs up, thumbs down idea around content.”
So, how fantastic that you were able to show this content, and then step back and say, “Okay, how is this? Is this meaningful to you? What else would you like to know? What else would help you be more confident in making decisions about what you were going to publish, or who you were going to deliver it to?” So, how great that you were kind of able to do on-the-spot user research with the people you were delivering this data to.
Mike: And I think you have to ... the lesson we learned quickly is you have to get your hands dirty early. You can’t sort of step back and take six months to develop this beautiful new thing that you’ll reveal because you have to have ... what I’ve learned is you have to have many moments to reveal things for it to like click for someone. Where if you reveal the thing, that’s your one shot, and if they’re not using it then they’ll never go back to it.
But it just felt like Scoop became sort of this new member of the family for us, as we would start building things, and Aaron was even sort of releasing mini updates along the way every month or two. He would have a new thing added, like the mobile breakdown, or the cohort breakdown, and audiences in the email. All of a sudden, people would perk up and say, “Oh, I’ve always wanted that. That’s so cool. I’m going to use that all the time now.”
So, it became a living, breathing thing. It wasn’t just a flat report.
Kristina: And how ... who was collecting the data? I mean, you said it was data points coming from a bunch of different places. It was, you had WordPress analytics. What else were you pulling in?
Aaron: So, we’ve got data from the story itself in Google Analytics. Then, if it’s distributed in an email, we’ve got email analytics, mailing by mailing. So, how many opens did the mailing get, and then how many clicks did each story get in the mailing? We’ve got similar stats if something is shared on Facebook, you know, reach, engaged reach, impressions and likes and reactions, and all of that.
I think the point that ... it was that “so what” point, and the “how do I use this information?” question, that got us thinking about how to do this differently. For one thing, we didn’t want to just give everybody, you know, logins to all of these systems and be like, “Yeah, just go find your own data.” Because that just sounds like a disaster to me.
Instead, we wanted to make it contextual, and we also wanted to benchmark it because the question I was getting asked all the time is “How did this do?” And I would reply with a metric, and then, they would ask, “Is that good or bad?” I was like, okay yeah, all right. I got it. I got it. I can’t just tell you how it did when you ask me. I have to give context on ...
So, that’s where Scoop really shines in a way that none of those tools can on their own: we collect data, and it’s in the way that we collect it from Google Analytics. We know when a story is born. We know when it’s published. We have that timestamp in WordPress.
We can look at a list of stories as they’re published, and then ask Google Analytics for data per story based on the date it’s published. The reason we do that is so that we can set up a benchmark of how many page views every story gets on each day after it’s born. So, if the story is one day old, we can expect it to have a couple of hundred page views. If that story is two days old, now we have an average for every story: what do we expect for a story that’s two days old? Then so on, and so on, and so on.
So, we do this for 30 days. What we found when we did an initial discovery was that after two weeks a story, a typical story I’ll say, is going to get about as much traffic as it’s ever going to get. We run a fast-paced news organization. We publish 70 to 80 stories a month. Once it’s out there, you know, it has this, like, two-week lifespan. Then, there’s this long tail. So, we just set out to build a system to show that.
Like, what was the makeup of this traffic for that initial period after being published? Did more people come from the email, or from social, or direct? Was there a spike in referral traffic because the story got picked up somewhere? All of these interesting, like, tidbits about the story’s life start to come alive when you sort of look at traffic, web traffic, through the lens of distribution.
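A minimal sketch of the benchmark Aaron describes, average page views for each day of a story’s age over its first 30 days, assuming a daily per-story page-view export and a table of publish dates; the file and column names are illustrative:

```python
# Sketch: benchmark "expected page views by story age" and compare one story
# against it. Assumes daily per-story page views plus publish dates; the file
# and column names are illustrative.
import pandas as pd

daily = pd.read_csv("daily_pageviews.csv", parse_dates=["date"])  # url, date, pageviews
posts = pd.read_csv("stories.csv", parse_dates=["published"])     # url, published

df = daily.merge(posts, on="url")
df["age_days"] = (df["date"] - df["published"].dt.normalize()).dt.days

# Average page views a typical story earns on each day of its first 30 days.
benchmark = (
    df[df["age_days"].between(0, 29)]
      .groupby("age_days")["pageviews"]
      .mean()
)

def versus_typical(url):
    # One story's daily traffic next to the typical story of the same age.
    story = df.loc[df["url"] == url].set_index("age_days")["pageviews"]
    return pd.DataFrame({"story": story, "typical": benchmark}).dropna()
```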
Mike: If I could just add to that, because I think that’s a really good point, and also just to add where this fit, where Scoop fit, within our whole measurement practice. We looked at Scoop as not trying to solve for Google Analytics, and also not trying to solve for a real-time tool, because we were using both of those. Google Analytics provided this huge library where we could dig and slice, and find advanced segmentation down to a granular level and say, “Oh, what happened here?”
Also, we weren’t trying to do what the real-time tools, like Chartbeat or Parse.ly, were doing, which is what’s happening right now on the website, because we had both of those needs. Where this fit is somewhere in the middle: how did this do? So, answering that question that, as Aaron said, we got all the time, and if we could automate that we could free ourselves up to do more of the data digging and make the tool more useful to the users.
Kristina: So, in this context then, you were largely measuring readership and engagement, and shares, and so on, correct? Because it’s a journalism ... it’s a publication, like a magazine, right?
Mike: Yeah, yeah.
Kristina: Okay. So, let me ask you this ... let’s say that I’m on a team that is measuring interaction with website content, for example, or on a team that is measuring whether or not people can follow instructions in, like, a mobile app or so on. So it’s not just talking about the editorial product, or the editorial dashboard product Scoop. Tell me, how would you encourage those folks to start thinking about which metrics matter?
Aaron: When I talk about content engagement I usually try to solicit feedback about a person’s fit within a department and an organization. So, speaking about higher ed because that’s my bag, there are content strategy goals all over the place in higher education, because most institutions are pursuing a few key themes. So, let’s just take a generic example like recruiting and retaining students for admission.
Institutions always want that to happen, and that goal gets passed down to an individual department. So, let’s say you work in the history department and you want more history majors. The content strategy angle for that is, well, we need to write content that’s engaging but also has a certain call to action, like “Please register for our program.”
I try to connect the dots from the very top all the way down into, okay, so supposedly you’re writing content that’s going to encourage people to apply. We’ve got to find a behavior metric. We’ve got to find some sort of user action that’ll help us determine whether or not that happens. This is not always one to one, and it’s not always very easy. But if there’s ever a moment or a place or an interactive element that helps you say, “Yep, we got another one,” that’s your conversion metric. That’s your conversion point, and the things that lead up to it form a measurement stream, if you will, of data that you need to benchmark. On the one hand, what is typical? So, that’s what we mean when we say benchmarking.
We need to look for what’s typical. And then the other thing I usually tell people, and I think Mike and I do this in tandem, is we encourage people to try new things. So, measurement is really nothing without experimentation. You’ve got to be able to see where the data lies, and then ask yourself, “What can I do to change this?” So, you’ve got to try new things. You’ve got to embrace Miss Frizzle’s Magic School Bus rules. You know, you’ve got to get in there, make mistakes, get creative, and get messy. Try new things with your content and see where that takes the numbers. If that takes them in a direction you don’t want them to go, then try something completely different and see if they go in a direction you do want them to go.
Mike: So, then with our public affairs team, being a news org, you know, we’re not specifically trying to drive registrations, butts in seats for events, donors. It’s kind of general awareness, sentiment toward Harvard, overall reach to specific cohorts. So, what we tried to identify were things like scroll depth and read time, because we understood that those would tell us a bit more about the readers coming there, and did they get what they were promised, and did they enjoy it.
We also measured recirculation. So, not only are they reading this story, but are there other stories of interest for them? We also took a lot of time to pair up marketing channels with the engagement, to help identify, if we’re promoting these types of stories on social media, does that align with the interests of that type of audience versus others?
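One way to compute a recirculation rate like the one Mike mentions, the share of reading sessions that go on to a second story, assuming a per-session export of article page views; the columns are illustrative:

```python
# Sketch: recirculation rate = share of sessions that view more than one
# distinct story. Assumes a per-session export of article page views.
import pandas as pd

hits = pd.read_csv("article_pageviews.csv")  # session_id, url

stories_per_session = hits.groupby("session_id")["url"].nunique()
recirculation = (stories_per_session > 1).mean()
print(f"Recirculation: {recirculation:.1%} of sessions read a second story")
```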
So, we’re really trying to find the metrics on our site that align with what Aaron said, which is sort of the ultimate user outcome. But then also go upstream and say, “Are we doing the right things to bring the right people in to perform these actions as well?”
So, it wasn’t always an easy path. We didn’t have a cut-and-dried calculation, but as we started adding more and more of these types of metrics to the dashboard, we started surfacing questions from the writers saying, “You know, I wonder if adding more photos to a story would ever get people to spend more time or have a better experience?” Then we said, “Great, we can figure out how to measure that.”
Kristina: Yeah, you know, I think that is probably my biggest takeaway from this conversation: an amazing, awesome metrics dashboard isn’t going to be amazing or awesome or useful if you don’t remain in constant conversation with the people who are actually looking at and understanding the metrics, asking them what would help them basically do their jobs better. That just seems to have been such a huge part of the success of building out Scoop.
Aaron: Yeah, totally. And I think in a lot of cases people are also asking questions along the way, and you start to realize there are these shared stories or anecdotes out there that you don’t know where they came from. And then you find out they did come from data.
So, I think in showing them more data you start to have these conversations and questions, sort of assumptions or questions, these things that have been long-held beliefs within a department. You can start to just show them performance, and that totally changes the conversation into “How do we improve that?” rather than “I’m not agreeing with that.”
Kristina: Right. And you know, one of the things that we saw in the feedback to your talk, a lot of people were like, “Where can I get this? When is this going to be open source? This is amazing and I want it.” And you know, Aaron, you and I were talking before we started recording, just sort of like, “Well, this is something that we ...” that you guys built out for a very specific need within Harvard, but my perspective is that in any organization, really higher ed or otherwise, there are so many common questions and pain points and stories around content that really are just not that different from organization to organization.
So, it’s not surprising to me that people in the audience were just like, “Where do I get me some of that?”
Aaron: Yeah, and I think that you know, the thing I want to ... I don’t want to say that you can’t have it because we’re pretty open with like, there’s nothing special about this code base, except that it’s just really specific to us. And we ... my apology to everyone is that when we built this we just built it for us. We didn’t think about building it to be open source. So, that’s just going to take a minute.
I think we’ll get there. I think we’ll have something one day for people to download and use, but it’s got to solve a broader question than what we specifically implemented here.
You know, the nature of the questioning is actually where I like to take people. People are like, “I want it. Where can I get it?” The answer is you can do all of this on your own with pre-existing tools. It takes some number crunching, but the whole point of Scoop is just to set a baseline for expectations. Even if you had to do that by hand, by just generally looking at something after it’s been published for a few days, just making mental notes of “Oh, this is what I can expect for my own work after a certain period of time.” It doesn’t necessarily take the technology that we’ve built to automate that. You know, you can do a lot of that stuff just with your own note-taking.
Kristina: Yeah, and I certainly don’t think that ... we didn’t ask you guys to speak on this topic because we were just like, “Then, someday they’ll sell it, and everyone will have it.” It really ... I mean, the point is exactly what you just described, which is we really ... I think the majority of the audience is way more interested in the process of how did you identify the need, where were your sources for the data, how did you make decisions about what to include, and how did the thing evolve?
And you know, this idea of ongoing experimentation, constant iteration, being in direct, face-to-face conversations with the people who were getting the data and using the data. That’s the story of Scoop.
Mike: We also, I mean, in the early stages we looked far and wide to see if there was a turnkey solution that could do a lot of this for us. There really just wasn’t. It sort of became this, you know, “What if we just tried this thing? What if we added this?” So we kept sort of attaching pieces to Mr. Potato Head to say, “Oh, I think it’s looking good now.” Then we added one more, and one more. It got to a point, I think about a year and a half or two years in, where we started realizing, wait, people are asking us about this.
We had deans from other departments telling their comms teams to build them what our department had built. So, that’s when we started really realizing, wait, this is at a different tier than just automating some of the numbers collection. It became more of a product, and we really needed to think about how this thing would scale beyond the people within our own department. How would it scale across Harvard? How would it scale beyond that?
I know Aaron spent a lot of time thinking through that as he got into that sort of next phase.
Kristina: Well, and ultimately I think that this story can, and should, be really empowering for content strategists and analytics folks: yeah, there might not be a turnkey solution out there, free or super expensive, and it is likely that you’ll be learning along the way. I mean, you didn’t go to school for this stuff. None of us went to school for this stuff.
So, there is no reason that people can’t go out and start asking these questions, and trying new stuff. It’s inspiring. It’s smart. It’s amazing. You guys are terrific, and I really very much appreciate your being able to come and speak at Confab this year, and for taking the time to be my guests on The Content Strategy Podcast. So, thank you so much.
Aaron: Thank you Kristina.
Mike: Thank you very much. Thanks for everything you do too. You’re awesome.
Kristina: Aww.
Aaron: I remain your biggest fan boy.
Kristina: I paid them $20 each to say that.
---
You’ve been listening to The Content Strategy Podcast. I’m your host, Kristina Halvorson. This podcast is brought to you by ContentStrategy.com and Brain Traffic, a content strategy consultancy. Find out more about Brain Traffic at, of course, BrainTraffic.com. Thanks and we’ll see you next time.
The Content Strategy Podcast is a show for people who care about content. Join host Kristina Halvorson and guests for a show dedicated to the practice (and occasional art form) of content strategy. Listen in as they discuss hot topics in digital content and share their expert insight on making content work. Brought to you by Brain Traffic, the world’s leading content strategy agency.
Follow @BrainTraffic and @halvorson on Twitter for new episode releases.