Game Theory in Splitting Rent and Utilities (Part 2)
A month ago I proposed an incentive and penalty system (Mechanism Design) to solve the chipping-in problem at my friend's rented house. The idea was: if someone is late paying the house fund, the wifi password is immediately changed and their access is revoked. According to my Game Theory calculations, this would force everyone to choose the Cooperate option and abandon the Defect option.
Yesterday my friend came and gave a report from that field experiment.
It turns out, the theory worked. In the first month, one person paid late and immediately felt the coldness of life without wifi; after that, everyone diligently transferred the house funds on the first of each month. The house treasury system ran smoothly, the water gallon was always full, and the electricity token never beeped in the middle of the night.
But, as always, there was a side effect that was not captured in my mathematical model.
This overly rigid penalty system created a toxic, zero-trust environment. His housemates became extremely calculating. When A bought the gallon and earned a chipping-in discount, B got jealous; the next day B hid the empty gallon in his room so that only he would know it was empty and could be the one to replace it and claim the reward points.
Eventually, their friendship turned into a cutthroat competition to get resources. Moral values and togetherness were destroyed, replaced by profit and loss calculations.
This is what in economics is called Goodhart's Law: "When a measure becomes a target, it ceases to be a good measure."
Initially, their target was to make the house comfortable through a fair chipping-in system. But because the system was judged by points and discounts, their target shifted to farming as many points as possible, or dodging fines by the narrowest possible margin.
From this I learned that Mechanism Design cannot be applied 100 percent raw to social relationships involving emotions. Humans are not purely rational agents (Homo Economicus) who only think about utility. Humans have ego, jealousy, and pride.
So how do you patch this system?
I suggested my friend apply the Tit-for-Tat algorithm with Forgiveness. In Game Theory, plain Tit-for-Tat famously won Robert Axelrod's iterated prisoner's dilemma tournaments, and the forgiving variant holds up even better when noise and honest mistakes enter the game.
Here is how it works: in the first iteration, you always choose Cooperate (trust your friend). If they Defect (break a promise), you answer with Defect in the next round (apply the penalty). But, and this is the most important part, you never lock into permanent retaliation: as soon as they show good faith again, you must give Forgiveness and return to Cooperate, so one slip does not spiral into an endless grudge loop.
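The strategy above can be sketched in a few lines of Python. The forgiveness probability here is my own illustrative assumption, not a canonical value:

```python
import random

# C = cooperate (pay the house fund on time), D = defect (pay late).

def forgiving_tit_for_tat(their_history, forgiveness=0.2):
    """Decide this round's move from the opponent's past moves.

    Round 1: always cooperate (trust first).
    Afterwards: mirror their last move, except that after a
    defection we forgive with some probability, instead of
    locking both sides into endless mutual retaliation.
    """
    if not their_history:
        return "C"                      # iteration 1: trust your friend
    if their_history[-1] == "D" and random.random() < forgiveness:
        return "C"                      # forgiveness: offer a way back
    return their_history[-1]            # otherwise, mirror their last move

# Deterministic corner cases (forgiveness forced to 0 or 1):
print(forgiving_tit_for_tat([], forgiveness=0.0))     # "C" — always trust first
print(forgiving_tit_for_tat(["D"], forgiveness=0.0))  # "D" — penalty round
print(forgiving_tit_for_tat(["D"], forgiveness=1.0))  # "C" — full forgiveness
```

Setting `forgiveness` somewhere between 0 and 1 is the "empathy module": high enough to break grudge loops, low enough that defection still stings.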
Do not make your penalty system permanent or too cruel. If they pay late because their father actually has not received his salary yet, do not just immediately cut their wifi. Give some leeway, but still record it as a debt. A good system must have tolerance for edge cases and life anomalies.
So, Game Theory is indeed very helpful to structure a problem. But during implementation, you still need an empathy module so your system is not rigid and ends up destroying your own friendship database.
- Khay
Nash Equilibrium in Dormitory Households
Chipping in for electricity tokens or dish soap is a classic Prisoner's Dilemma. If everyone decides to pay (cooperate), the dorm remains clean and illuminated. But each individual always has an incentive to defect (pretending you don't know the token is empty) in hopes that someone else will pay first.
The bug here is that if everyone chooses to defect, everyone ends up showering in the dark waiting for the water to shut off. We must architect a Nash Equilibrium where no one benefits from unilaterally deviating. How? Create a social smart contract. Tape a ledger to the fridge. Whoever fails to pay gets their Wi-Fi bandwidth throttled to 10 KB/s via the dorm's Mikrotik router. Sometimes positive incentive systems fail, and you must deploy systemic punishment to make the laws of Game Theory work in your favor.
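You can check this shift in equilibrium mechanically. Using the classic Prisoner's Dilemma payoff numbers and a hypothetical Wi-Fi-throttle cost of 3 (both are illustrative assumptions, not measurements from any real dorm), a small script shows that the penalty moves the stable outcome from mutual defection to mutual cooperation:

```python
# Classic Prisoner's Dilemma payoffs: (payoff_A, payoff_B) per (move_A, move_B).
# C = chip in for the token, D = freeload.
base = {
    ("C", "C"): (3, 3),   # both pay: lights on, cost shared
    ("C", "D"): (0, 5),   # A pays alone, B freeloads
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),   # nobody pays: showering in the dark
}

PENALTY = 3  # assumed cost of 10 KB/s Wi-Fi for a defector

def with_penalty(payoffs, penalty):
    """Subtract the penalty from any player who defected."""
    return {
        (a, b): (pa - (penalty if a == "D" else 0),
                 pb - (penalty if b == "D" else 0))
        for (a, b), (pa, pb) in payoffs.items()
    }

def is_nash(payoffs, a, b):
    """(a, b) is a Nash equilibrium if neither player gains
    by unilaterally switching their own move."""
    flip = {"C": "D", "D": "C"}
    pa, pb = payoffs[(a, b)]
    return (payoffs[(flip[a], b)][0] <= pa
            and payoffs[(a, flip[b])][1] <= pb)

print(is_nash(base, "D", "D"))                        # True  — the sad default
print(is_nash(base, "C", "C"))                        # False — temptation to freeload
throttled = with_penalty(base, PENALTY)
print(is_nash(throttled, "C", "C"))                   # True  — cooperation now stable
print(is_nash(throttled, "D", "D"))                   # False — defection no longer pays
```

The design question is simply whether the penalty is larger than the temptation to freeload; here, a throttle cost of 3 beats the freeloader's bonus of 2 (5 minus 3).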