My lessons learned from marketing performance analytics projects. They are entirely personal and probably not directly applicable to every situation. Still, I believe most of the points might be helpful for my peers in their tedious search for KPIs, ROI, ROMI and other holy grails. Some of the advice may look like a stretch, or too demanding; it is certainly easier to say than to follow. Nevertheless, taking these points into consideration would have made my life better and my working hours shorter. Take this as my address to myself – for past and future projects.
- Be a Data Sense Maker, not a Calculator. There are two sides to analytics – operational and analytical. The operational side is the role of a data processor, a calculator who mechanically applies known parameters and formulas and returns numbers, ratios, and percentages. You are not responsible for skewed values, data incompleteness, errors, or incoherent output – you just do your work. Honestly, this is not where you want to be. It offers low job security, since those functions can be automated. Most importantly, it brings little value: numbers mean nothing until they make sense and tell a story. So whenever you can, transform your operational role into a managerial and analytical one – provide sensible data, bridge the gaps, make sure the metrics are accurate and the overall output is relevant. One step further is explaining the data, drawing conclusions and recommendations, identifying trends and pain points, and justifying expenses.
- Stay tuned to your stakeholders’ needs. Be very attentive to management’s requests, concerns, and points of view. The data you provide should address those needs and speak the same language they speak. Listen and take notes, and be highly sensitive to what is in the air. A slight change in a single parameter might give you drastically different results, and the more data, the bigger the difference. So you truly want to know what they want.
- Know your metrics. Define your metrics as precisely as possible, in all their details and nuances. Explain what they mean, use commonly accepted language and terminology (or establish your own) to avoid any misunderstanding, run it by your manager and stakeholders, and have it approved. Keep the description open, and as soon as you notice any wonkiness, concern, or question, write it down and have it resolved, now or later.
Problems may occur when the required metrics are not compatible with, or feasible within, the existing CRM, processes, and practices. In that case, work it through with your team, align the metrics, look for workarounds, offer alternatives, and, yes, have them approved.
- Mark Your Route. Document everything at every single step. It might be annoying and take time and effort, but it is worth it. Save your report parameters and the dates when you pull the numbers, and keep your raw data files for future reference. Remember that data is live: in a week it might give you a different picture, and in a year it will be unrecognizable. If you keep backup files, you can always check, compare with other data later, and tune and refine your original metrics.
Keep all your files and reports in perfect order, with a consistent naming convention and a logical folder tree. Don’t rely on your memory or common sense! Imagine that someone else should be able to jump in – organize your work environment accordingly.
- Validate your data. Compare your numbers with numbers from other sources, check benchmark metrics, ask other teams and departments, and make sure you are not missing anything important. If other parties use different parameters, either adjust yours or consider presenting both points of view (yes, double the work).
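As a minimal sketch of such a cross-check, assuming two hypothetical CSV exports with invented file and column names (lead_id, month), a few lines of pandas can surface where the sources disagree:

```python
import pandas as pd

# Hypothetical exports: one from the CRM, one from the marketing automation tool.
crm = pd.read_csv("crm_leads_2024-05-01.csv")
automation = pd.read_csv("marketing_automation_leads_2024-05-01.csv")

# Compare monthly unique-lead counts from the two sources side by side.
crm_counts = crm.groupby("month")["lead_id"].nunique().rename("crm")
auto_counts = automation.groupby("month")["lead_id"].nunique().rename("automation")

comparison = pd.concat([crm_counts, auto_counts], axis=1)
comparison["difference"] = comparison["crm"] - comparison["automation"]

# Any non-zero difference is worth investigating before reporting.
print(comparison[comparison["difference"] != 0])
```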
- Don’t over-complicate. Your output should be easy to understand and follow. Fewer, but solid and proven, metrics are always better. Use parameters that are as simple as possible, and avoid advanced filter logic wherever you can. It is better to find problems (non-standard values, wrong fields, missing data) and solve them for good than to patch them with fancy filter strings.
- Know your data, but don’t be biased. You should know your standard measurement values by heart, have a solid understanding of the main data flows and volumes, and be aware of existing tendencies and specifics. There should be no big surprises in what you get. At the same time, stay away from any bias, don’t make quick assumptions, and keep a fresh eye. Avoid giving estimations and guesses – they might be wrong.
- Don’t assume that data is accurate. It is NOT! Constantly look for errors and mistakes, and think of the people involved in data management – what did they do differently before? What kinds of mistakes did they make earlier? Which practices were changed recently, while people might still be working by the old habits? Look and check.
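A minimal sketch of the kind of routine sanity checks I mean, assuming a hypothetical leads export with invented column names (lead_id, campaign_id, created_date, lead_status):

```python
import pandas as pd

# Hypothetical export file and columns.
leads = pd.read_csv("leads_export.csv", parse_dates=["created_date"])

# Simple sanity checks before trusting the numbers.
print("Duplicate lead IDs:", leads["lead_id"].duplicated().sum())
print("Missing campaign source:", leads["campaign_id"].isna().sum())
print("Created dates in the future:",
      (leads["created_date"] > pd.Timestamp.today()).sum())

# Unexpected category values often reveal changed practices or old habits.
print(leads["lead_status"].value_counts(dropna=False))
```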
- Use technology to help you. Invest your time in mastering Access, the advanced functionality of Excel, analytical tools, and platforms. If you do any procedure repeatedly, find a way to automate, program, or model it.
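For example, a repeated report pull can be scripted so that every run also leaves a date-stamped raw snapshot behind, which also covers the "Mark Your Route" advice above. A minimal sketch, assuming a hypothetical CSV export and folder layout:

```python
from datetime import date
from pathlib import Path

import pandas as pd

# Hypothetical raw report export that gets re-pulled every week.
report = pd.read_csv("campaign_report_export.csv")

# Save a date-stamped snapshot in a consistent folder tree,
# so the numbers can always be traced back to the day they were pulled.
snapshot_dir = Path("reports/campaign_performance/raw")
snapshot_dir.mkdir(parents=True, exist_ok=True)
report.to_csv(snapshot_dir / f"campaign_report_{date.today():%Y-%m-%d}.csv", index=False)
```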
- Turn your complaints into a plan. Is the project completed? You are not done. Document all the issues, inconsistencies, and irregularities that you discovered during your deep dive into the data, and turn them into a plan for further improvement. The data gets cleaner; your work gets easier (or more sophisticated). It’s a win-win!
Several notes on traps that are very specific to marketing analytics. Beware of:
- double-counting of everything (one campaign sourced several opportunities, one opportunity is influenced by several campaigns, etc.) – see the sketch after this list
- cross-counting results (one action might be logged and tracked by several campaigns)
- same labels for different objects or actions of different value (e.g. registered for a webinar vs. registered for a trial)
- different time frames (e.g. campaign results reported for the current period vs. accumulated over a longer period)
- using different chart types might return different totals (in Salesforce)
- different objects for similar concepts (leads vs. contacts in Salesforce).
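To illustrate the double- and cross-counting traps, here is a minimal deduplication sketch, assuming a hypothetical campaign-influence export with one row per campaign touch on an opportunity (invented columns opportunity_id and opportunity_amount):

```python
import pandas as pd

# Hypothetical export: one row per campaign touch on an opportunity.
touches = pd.read_csv("campaign_influence_export.csv")

# Naive total: sums the same opportunity once per influencing campaign.
naive_total = touches["opportunity_amount"].sum()

# Deduplicated total: count each opportunity's amount only once.
unique_opps = touches.drop_duplicates(subset="opportunity_id")
true_total = unique_opps["opportunity_amount"].sum()

print(f"Naive (double-counted) total: {naive_total:,.0f}")
print(f"Deduplicated total:           {true_total:,.0f}")
```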