I hate accounting policy.
For my 403(b) retirement plan, my employer contributes “5% of [my] base salary up to $34,000; 10% of [my] base salary above $34,000.” If I make (hypothetically) $68,000, then I should get 5% on half of it and 10% on the other half, which is 7.5% on average. That’s $5,100 a year, or $425 per month.
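Here’s that arithmetic written out, just to show where the $425 comes from (a quick sketch in Python, using the hypothetical numbers above):

```python
salary = 68_000   # hypothetical annual salary from the example
cutoff = 34_000   # threshold where the contribution rate changes

# My "averaging" reading: 5% on earnings up to the cutoff,
# 10% on everything above it, spread evenly over the year.
annual = 0.05 * min(salary, cutoff) + 0.10 * max(salary - cutoff, 0)
print(annual)       # 5100.0 per year
print(annual / 12)  # 425.0 per month
```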
Sixth graders, are you following this?
Now let’s all guess why my last paycheck shows an employer contribution of only $283 (adjusted for my hypothetical salary, of course). What happened to the other $142?
It turns out my “averaging” technique is wrong. My employer contributes 5% until they’ve covered $34,000 in earnings, and then switches to 10% for the rest of the year. Most years this is perfectly fine: I can’t spend this money until long after the year ends, so only the total matters, and over a full year the total comes out the same.
However, what about this year? If I only became eligible to join the plan in July (halfway through the year), then I’ll see nothing but 5% contributions all year, because I’ll never earn past the $34,000 cutoff before December. I miss out on that 2.5% difference.
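To see why, here’s a minimal sketch of the rule as I understand it, assuming monthly pay periods and assuming that only earnings while I’m enrolled count toward the $34,000 cutoff (that last part is my guess at how the plan tracks it):

```python
def employer_contribution(monthly_pay, months_in_plan,
                          cutoff=34_000, low=0.05, high=0.10):
    """Contribute `low` on earnings until `cutoff` is covered, then
    `high` for the rest of the year. This is my reading of the plan's
    rule; it assumes only earnings while enrolled count toward the cutoff."""
    total = earned = 0.0
    for _ in range(months_in_plan):
        below = max(min(cutoff - earned, monthly_pay), 0.0)
        above = monthly_pay - below
        total += low * below + high * above
        earned += monthly_pay
    return total

monthly = 68_000 / 12  # the hypothetical salary, paid evenly

# Full calendar year: the rate flips to 10% halfway through,
# and the total matches the blended 7.5% figure.
print(round(employer_contribution(monthly, 12)))  # 5100

# Eligible only in July: just $34,000 of earnings count, the cutoff
# is never crossed, and everything stays at 5%.
print(round(employer_contribution(monthly, 6)))   # 1700
```

A full year lands on the same $5,100 either way; the half year never reaches the cutoff, so every dollar gets the 5% rate.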
Of course, I used a $68,000 example salary to make the math easier, but really anybody who earns more than the $34,000 cutoff and joins partway through the year suffers to some degree. I’m outraged!
(Okay, I’m not really outraged, but I am mildly disappointed to learn that some of my unhatched chickens will never hatch—and they were my retirement chickens!)
This is like the well-known problem:
A train leaves Chicago at 10am going east at 45MPH. Another train leaves at 11am going west at 75MPH. Where do they bury the survivors?
In the retirement version: A person starts work in 2005 at 5%. In 2008, he/she switches to 10%. Where do they bury the survivors?