Insiders' Insight KPI - January 2024

By VHMA Admin posted 01-12-2024 21:24

  
Growth in revenue, patient visits, and new client numbers declined in December 2023 compared to December 2022. (Note, however, that December 2023 had one fewer workday than December 2022, so this decline may not reflect true changes in practice operations or client demand.)
 
As we head into 2024, planning for improvement in the New Year continues to be important. Last month we talked about preparing or updating the practice’s budget; this month is a good time for practices to start or continue analyzing their 2023 data, both by tracking internal trends and by comparing the practice to published comparative data.
 
Unfortunately, some of the published benchmark sources the profession has relied on for years are no longer available, are being published less frequently, or don’t contain the depth of data we are used to seeing. Other data sources have popped up, some with good-quality data and others of uncertain quality. These sources range from formal studies with reasonable statistical validity, to informal compilations of data with no statistical analysis, to offhand “in my experience” comments about what a data point should be, which may or may not provide a valid comparison for your practice.

Before drawing any significant management conclusions from a comparison of your practice’s data to outside published data, it’s important to understand how the outside data was obtained and compiled. Any good quality study will have a methodology section with this information.

Key factors to remember when performing comparisons to published data:
  • How old is the data? Some metrics don’t fluctuate much over time, but others do, and the age of the data may affect the usefulness of the comparison. This is particularly true for revenue numbers, which are pushed up by regular fee increases.
  • Is the data meant to represent an average practice or “best practices”? Different studies have different purposes.
  • Is the data comparative? The methodology section of the report will help you understand how many practices responded and how comparative the data is. Comparativeness means how similar the practices in the analysis are to your practice. If all of the practices included in the study are small animal and your practice is a mixed animal practice, the data may provide a very poor comparison, i.e., a “garbage-in, garbage-out” situation. In general, the more practices included in the study, the more likely the numbers are to be representative of the profession, although this isn’t the only factor to consider in deciding whether the data is comparative and useful. The best studies are designed to be representative of the profession: for example, if 35% of the practices in the United States are located in the Northeast census region, then 35% of the practices in the study will be from that region. The same goes for other practice attributes such as size and type of practice.
  • It is also important to understand how certain calculations were performed in order to know whether a metric is comparable to your practice. For example, the definition of an active client must be the same between the benchmarking study and the individual practice under analysis. If a study uses 12 months as the definition of an active client but your practice considers an active client to be one who visits every 15 months, then this isn’t an apples-to-apples comparison, and you may think your practice is doing better than it actually is (see the sketch after this list).
  • As data is cut into ever smaller units (for example, by region or state), sample sizes get smaller. I’ve seen studies publish state-by-state averages based on just two or three data points; better studies generally will not publish data unless there is a larger, more reliable number of data points.
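
To illustrate the active-client point above, here is a minimal sketch in Python. It assumes nothing more than a list of each client’s most recent visit date; the dates, window lengths, and counting approach are hypothetical and only meant to show how a longer “active” window inflates the count relative to a 12-month benchmark.

    from datetime import date, timedelta

    # Hypothetical last-visit dates for a handful of clients (illustration only)
    last_visits = [
        date(2023, 11, 2), date(2023, 6, 15), date(2022, 12, 20),
        date(2022, 10, 5), date(2023, 1, 30), date(2022, 11, 18),
    ]

    def active_client_count(visits, as_of, window_months):
        # Count clients whose most recent visit falls within the window
        cutoff = as_of - timedelta(days=window_months * 30.44)  # approximate month length
        return sum(1 for v in visits if v >= cutoff)

    as_of = date(2023, 12, 31)
    print("12-month definition (study):   ", active_client_count(last_visits, as_of, 12))
    print("15-month definition (practice):", active_client_count(last_visits, as_of, 15))

With the same underlying visit data, the 15-month window will always count at least as many clients as the 12-month window, so comparing a 15-month internal number against a 12-month benchmark overstates how the practice is doing.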
No study will be perfectly comparable to all practices. For example, an individual practice may be comparing its current data to a study whose data is three years old, or the practice may include different items in its measure of personal doctor revenue production than the study does. This doesn’t mean the study is useless; it may still provide valuable information to help operate the business more effectively. It simply means these comparisons must be used with caution and as just one tool in running the business, not as the final word on how well a practice is doing.
 
It’s also important to remember that just because a particular practice is different from the comparative practices included in a study doesn’t mean there is something wrong with that practice. All practices are unique in one way or another, and the published benchmarks may not be the standards that apply to a particular practice. However, if the numbers are significantly different, it is strongly recommended that the reasons for these differences be investigated to determine whether there is a financial or operational problem that needs to be resolved. I’m a big fan of using comparative data to gain a better understanding of how well a practice is doing, but it’s critical that you understand the source and reliability of the data.

Download Insiders' Insights - KPI, January 2024 Report

➤ VHMA Members can access the interactive KPI dashboard to drill down by region, species, and practice size.

Karen E. Felsted, CPA, MS, DVM, CVPM, CVA of PantheraT Veterinary Management Consulting, www.PantheraT.com, provides data review and commentary.

#InsidersInsights-KPI
