A few weeks ago, I suddenly had the idea to calculate how much time I had spent on programming (including learning). This led to some reflections. The calculation formula is as follows.

Average hours per day × Number of days in a year × Total number of years

I started programming in the spring of 2013 and decided to fully commit to it in 2014. I spend anywhere from 5 to 16 hours a day coding, every day, weekends included. My time is easy to estimate because, apart from the occasional outing, almost all of my waking hours go into coding, minus eating, sleeping, and other necessities. I’ve always had trouble sleeping, so I don’t get many hours of sleep in a day, and sometimes none at all. As a result, I often end up coding for more than 16 hours straight (the longest stretch was probably over 36 hours). 🤔 This is actually quite painful, but I won’t go into detail here; I just want to record how much time I’ve spent.

Let’s start calculating, counting 12 years of 365 days each:

Upper limit: 365 * 12 * 16 = 70,080 (hours)  
Average: 365 * 12 * ((16 + 5) / 2) = 45,990 (hours)  
Lower limit: 365 * 12 * 5 = 21,900 (hours)

Average of upper and lower limits: (21,900 + 70,080) / 2 = 45,990 (hours)
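
For anyone who wants to double-check the arithmetic, here is a minimal Python sketch of the same calculation. The 12 years and the 5–16 hour range come from above; the 365 days per year with no breaks is the same rough assumption.

```python
# Rough estimate of total coding hours: 12 years since 2013,
# between 5 and 16 hours of coding per day, 365 days a year.
YEARS = 12
DAYS_PER_YEAR = 365
MIN_HOURS, MAX_HOURS = 5, 16

days = YEARS * DAYS_PER_YEAR

lower = days * MIN_HOURS                      # 21,900 hours
upper = days * MAX_HOURS                      # 70,080 hours
average = days * (MIN_HOURS + MAX_HOURS) / 2  # 45,990 hours (midpoint of the limits)

print(f"Lower limit: {lower:,} hours")
print(f"Upper limit: {upper:,} hours")
print(f"Average:     {average:,.0f} hours")
```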

Although the calculated average is roughly 46,000 hours, I feel the real figure should be between 50,000 and 60,000. This is mainly because 5 hours a day is the absolute minimum; a typical day is closer to 10 hours.

I once heard that if you invest over 100 hours in the same thing, you’ve already surpassed 80% of people. So what does investing 50,000 hours mean? Looking back, I realize I’ve come so far… It’s truly awe-inspiring!