Healthcare marketing analytics leader with 15+ years across pharma, HCP, DTC, omnichannel measurement, web analytics, QA automation, dashboards, and executive storytelling.
“I find what others miss in the data.”
I help healthcare and pharma teams find the signal in fragmented marketing data, build trusted measurement frameworks, and turn dashboards into decisions.
With over 15 years in healthcare marketing analytics, I have worked across major pharma brands, HCP and DTC campaigns, omnichannel media, web analytics, CRM, claims-based measurement, and executive reporting. My career has been shaped by agencies including Publicis Health, Digitas Health, and Epsilon — environments where the data is complex, the stakes are high, and the clients need answers that hold up under scrutiny.
I work across the full analytics lifecycle: defining the right questions, connecting the right data sources, building measurement infrastructure, QA-ing the inputs, and translating performance into insight that drives action. I am as comfortable in an Excel workbook building a measurement logic layer as I am presenting findings to a VP of Marketing.
“I work where marketing strategy, messy data, and business decisions meet.”
Design measurement frameworks that connect media, web, CRM, and claims data into a coherent view of campaign performance.
Define the right KPIs for each journey stage, tactic, and audience so performance reporting connects to business goals.
Validate and troubleshoot GA4, Adobe Analytics, and GTM implementations to ensure data collection is trustworthy from the start.
Build reporting systems that create trusted sources of truth and dramatically reduce the manual effort required each reporting cycle.
Translate complex analytics into clean slides that answer “so what?” and lead stakeholders from data to decision.
Connect media exposure to real-world prescription and patient data to measure audience quality and campaign impact.
Design and analyze holdout tests, exposed vs. unexposed comparisons, and incrementality studies that isolate what media truly drives.
Build repeatable QA processes that replace manual file checking with automated validation, exception flagging, and audit trails.
Design Excel workbooks that function as micro-databases: lookup logic, pivot structures, linked outputs, and repeatable templates.
Lead measurement conversations across media agencies, analytics vendors, and client stakeholders to reach shared definitions before the campaign launches.
Measure healthcare professional and patient-facing campaigns across digital, search, display, email, and content channels.
Apply AI tools to accelerate analysis, automate routine tasks, and improve analytical throughput — without replacing judgment.
Connect tactics to journey stage, KPI, data source, reporting cadence, and decision use. A good framework tells you what to measure, why it matters, where the data lives, and what action follows each signal.
Build dashboards and reporting systems that reduce manual effort, establish a single trusted source of truth, and give teams the ability to answer client questions confidently — without scrambling for data mid-call.
Replace manual file-by-file checking with repeatable QA pipelines that scan folders, validate formats, flag exceptions, and generate audit reports. Less analyst time. More consistent outputs.
GTM, GA4, and Adobe-style implementation QA: tag validation, bot and noise detection, consent impact analysis, traffic source diagnosis, form path performance, and behavioral pattern review.
Convert complex data into clean, decisive presentations. The goal is not a data dump — it is a clear narrative: here is what happened, here is why it matters, here is what we should do next.
Teach teams to think clearly through data: pivot tables, lookup structures, Power Query logic, QA checklists, and repeatable reporting templates. Turn report builders into insight generators.
All examples are anonymized and sanitized. No client data, campaign IDs, or proprietary metrics are displayed.
A publisher promised high audience delivery for a sponsored content center, but reported delivery numbers did not match observed behavior in the traffic data.
Audited traffic patterns across time, monitored competitive activity, used a market disruption as a natural experiment, and compiled evidence into a structured presentation for leadership.
Secured make-good value from the publisher, reduced spend against low-quality delivery, and reallocated budget to higher-performing tactics.
Reach metrics showed impressions delivered, but gave no evidence that media was actually reaching the diagnosed patient audience. Reach without relevance is waste.
Connected media exposure data to claims-based validation, benchmarked audience quality against real-world diagnosed prevalence, and compared exposed versus unexposed groups.
Identified higher-quality publishers, reframed the optimization conversation from cost-per-impression to cost-per-qualified-reach, and informed spend reallocation.
Monthly data QA required manually opening multiple vendor files, checking row counts, NPI formats, date structures, and file completeness. The process was time-consuming and inconsistent across analysts.
Used AI-assisted coding to build a repeatable folder-level QA script that scanned incoming files, validated against defined rules, and produced a structured exception report.
Reduced manual QA effort significantly, improved consistency across analysts, and surfaced data quality issues earlier — before downstream reporting was affected.
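As a rough illustration of the folder-level approach described above — file names, column names (`npi`, `service_date`), and rule thresholds here are invented for the example, not taken from any client workflow:

```python
# Hypothetical sketch of a folder-level QA pass over CSV vendor files.
# Rules shown: minimum row count, 10-digit NPI format, ISO date format.
import csv
import re
from pathlib import Path

NPI_RE = re.compile(r"^\d{10}$")              # NPIs are 10-digit identifiers
DATE_RE = re.compile(r"^\d{4}-\d{2}-\d{2}$")  # expected YYYY-MM-DD dates

def qa_file(path: Path, min_rows: int = 1) -> list[str]:
    """Return a list of exception strings for one vendor file."""
    exceptions = []
    with path.open(newline="") as f:
        rows = list(csv.DictReader(f))
    if len(rows) < min_rows:
        exceptions.append(f"{path.name}: only {len(rows)} rows")
    for i, row in enumerate(rows, start=2):    # line 1 is the header
        if not NPI_RE.match(row.get("npi", "")):
            exceptions.append(f"{path.name} line {i}: bad NPI {row.get('npi')!r}")
        if not DATE_RE.match(row.get("service_date", "")):
            exceptions.append(f"{path.name} line {i}: bad date {row.get('service_date')!r}")
    return exceptions

def qa_folder(folder: Path) -> dict[str, list[str]]:
    """Scan a folder of incoming files and build a structured exception report."""
    return {p.name: qa_file(p) for p in sorted(folder.glob("*.csv"))}
```

The exception report is a plain dict keyed by file name, so it can be dumped to a log, a dashboard tab, or an audit trail without changing the validation logic.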
Teams were producing reports on schedule, but struggled to answer follow-up questions quickly or consistently. Each question required rebuilding the analysis from scratch.
Built Excel-based micro-database structures with lookup tables, pivot-driven summaries, and PowerPoint-linked outputs that could be refreshed and queried without manual rebuilding.
Improved reporting speed, increased client confidence, and gave teams the ability to answer deeper questions live — without scrambling to pull new data mid-call.
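The lookup-and-summarize pattern behind these workbooks is tool-agnostic. Purely as an illustration (the actual work lived in Excel; the tactic names and metric below are invented), the same micro-database idea in a few lines of stdlib Python:

```python
# Toy illustration of the micro-database pattern: a lookup table joined to
# fact rows, then pivoted into a per-stage summary.
from collections import defaultdict

tactic_lookup = {            # lookup table: tactic -> journey stage
    "display": "awareness",
    "search": "consideration",
    "email": "conversion",
}

def pivot_by_stage(rows):
    """Join each fact row to its journey stage and sum a metric per stage."""
    summary = defaultdict(int)
    for row in rows:
        stage = tactic_lookup.get(row["tactic"], "unmapped")
        summary[stage] += row["clicks"]
    return dict(summary)
```

In the workbook, the lookup table is a named range, the join is INDEX/MATCH, and the summary is a pivot — but the structure is identical, which is why it can be refreshed each cycle instead of rebuilt.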
Campaign reporting was organized by channel, not by business goal. It was impossible to see how tactics were moving HCPs or patients from awareness through consideration to conversion.
Mapped each tactic to a journey stage, clarified the KPI for each stage, aligned internal and client stakeholders on definitions, and connected media signals to downstream business questions.
Improved strategic clarity in optimization conversations, reframed reporting from “channel performance” to “journey impact,” and gave leadership a cleaner view of where to invest.
Analysts were producing accurate, well-structured reports — but stopping there. The data was right. The interpretation was missing. Reports described what happened, not what it meant.
Created a “so what?” review framework, coached analysts to connect performance changes to business goals, and built a feedback loop tied to actual client questions.
Improved insight quality across the team, built analyst confidence in client-facing storytelling, and reduced the revision cycle between analyst draft and final deliverable.
Examples are summarized at a category level to respect client confidentiality.
Thumbnails are placeholders. Sanitized screenshots can be added to assets/dashboards/.
Business question: How are all campaign tactics performing against KPIs, by channel and journey stage?
Data inputs: Media platform exports, publisher reports, web events
Audience: Campaign leads, media teams, client marketing VP
Business question: Which pages, paths, and content are driving meaningful engagement vs. high bounce?
Data inputs: GA4 or Adobe Analytics export
Audience: Brand team, digital leads, agency analytics
Business question: How much of our web traffic is non-human? What should we filter before reporting?
Data inputs: GA4 raw event data, session logs
Audience: Analytics leads, web team
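The bot-filtering question above usually starts with simple behavioral heuristics before anything fancier. A minimal sketch, assuming exported session records with invented field names (`user_agent`, `engagement_sec`, `pageviews`) — this is not a GA4 API, and real thresholds need tuning per site:

```python
# Illustrative heuristics for flagging likely non-human sessions in
# exported web event data before it reaches reporting.
BOT_UA_HINTS = ("bot", "crawler", "spider", "headless")

def looks_non_human(session: dict) -> bool:
    """Flag sessions with bot-like user agents or bot-like behavior."""
    ua = session.get("user_agent", "").lower()
    if any(hint in ua for hint in BOT_UA_HINTS):
        return True
    # Zero engagement time with many pageviews suggests automated traffic.
    if session.get("engagement_sec", 0) == 0 and session.get("pageviews", 0) > 5:
        return True
    return False

def split_traffic(sessions):
    """Separate sessions into (human, flagged) lists for reporting."""
    human = [s for s in sessions if not looks_non_human(s)]
    flagged = [s for s in sessions if looks_non_human(s)]
    return human, flagged
```

Reporting the flagged share alongside the filtered totals keeps the filtering decision auditable rather than invisible.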
Business question: Which HCPs are engaging across touchpoints, and at what depth?
Data inputs: CRM, email, media, web, target lists
Audience: Field force, brand marketing, analytics
Business question: Which publisher files passed QA? Which had exceptions, and what kind?
Data inputs: Publisher PLD files, QA rule set
Audience: Analytics team, media team
Business question: What is the one-page campaign status a VP needs to see?
Data inputs: Aggregated KPI data across all tactics
Audience: VP Marketing, brand leadership
Business question: Can we trace a line from media exposure to audience quality to business outcome?
Data inputs: Media, claims, web, CRM
Audience: Analytics leadership, brand strategy
Business question: How do we produce consistent, on-time monthly reports without rebuilding the structure each cycle?
Data inputs: All primary campaign data sources
Audience: Analytics team, client stakeholders
Slide thumbnails are placeholders. Sanitized examples can be shared upon request.
Excel is not just a tool. It is the environment where most healthcare marketing analytics teams build their analytical instincts — and where poor habits compound into reporting problems. I help teams go further.
I have trained analysts and clients in practical Excel skills that improve reporting quality, reduce manual effort, and build the kind of repeatable systems that survive staff turnover and client questions.
“From report builder to insight generator.”
From basics to advanced pivot structures that answer real business questions.
VLOOKUP, INDEX/MATCH, and structured reference logic that reduces formula errors.
Reusable templates that can be refreshed each reporting cycle without rebuilding.
Structured QA routines that catch formatting errors, missing data, and logic breaks before they reach a client.
Power Query logic, repeatable refresh workflows, and removing the copy-paste step from the process.
Coaching analysts to move beyond the number and connect performance to business meaning.
I have developed analysts from report builders into genuine business thinkers — pushing beyond accurate numbers to interpretation, narrative, and recommendation.
Designed clear handoff structures, templates, and QA protocols that make global team collaboration repeatable and low-friction.
Facilitated measurement alignment across media agencies, analytics vendors, client teams, and creative partners — keeping discussions structured and productive.
Bridged the gap between clients, media teams, analytics teams, and vendors — translating technical findings into business language and turning ambiguous questions into structured analysis plans.
Built reporting workflows that teams can sustain independently — clear templates, ownership, timelines, and quality gates built into the process.
Turn vague briefs and ambiguous business questions into clear analytical plans: defined question, data needed, method, output, and decision framework.
I inspect what others assume is correct.
I like clean data, clear logic, and useful reporting.
I enjoy translating complex data for non-technical stakeholders.
I ask “so what?” until the answer connects to a decision.
I like building repeatable systems that save time and reduce analyst toil.
I enjoy teaching analysts how to think beyond the numbers.
I am energized by messy problems, fragmented datasets, and hidden signals.
I care about doing the work right — not just getting it done.
Open to healthcare analytics, omnichannel strategy, marketing measurement, business insights, BI, and analytics leadership roles.
Philadelphia Metro · Glenside, PA