In order to explain what has happened I need to provide a short discussion of PG&E rates, rate schedules, and those things that show up on our bill that make no sense at all to almost everyone. There is a LOT that could be said about this topic, but for the moment I will attempt to keep it simple. Similar issues apply to rate schedules for utilities other than PG&E, but the details are different.
Here goes:
PG&E residential rates fall into two categories, time-of-use and non-time-of-use. The typical homeowner is on the E-1 schedule, which has rates that change with the season but not with the time of day. E-6 is the time-of-use option, where the rates depend upon the hour that power is used (or produced). Since most homeowners are on E-1, and because the discussion gets much more complicated for E-6, I am going to limit this discussion to E-1. There are five rate tiers that increase in cost during a billing period (a month) as the amount of power used increases. Just to make things a little more confusing, there is a summer baseline amount and a different winter baseline amount. For example, in my area the 2014 electrical rates are (a small worked example of how the tiers turn into a bill follows the list):
Summer baseline = 13.8 kWhr/day
Winter baseline = 11.2 kWhr/day
- $0.14707/kWhr for tier 1 (power used within the baseline amount)
- $0.17028/kWhr for tier 2 (>100% to 130% of baseline)
- $0.25859/kWhr for tier 3 (>130% to 200% of baseline)
- $0.31859/kWhr for tiers 4 and 5 (above 200% of baseline)
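To make the tier mechanics concrete, here is a small Python sketch (my own, not anything PG&E publishes) that turns the 2014 rates and baselines above into a monthly bill. It assumes a 30-day billing period and ignores minimum charges, taxes, and the other adjustments that appear on a real bill.

```python
# 2014 E-1 rates and baselines from the list above
RATES_2014 = [0.14707, 0.17028, 0.25859, 0.31859]   # tiers 1, 2, 3, 4/5 in $/kWhr
TIER_BREAKS = [1.00, 1.30, 2.00]                     # tier tops as multiples of baseline
BASELINE_2014 = {"summer": 13.8, "winter": 11.2}     # kWhr per day

def monthly_bill(usage_kwh, season, days=30,
                 rates=RATES_2014, baselines=BASELINE_2014):
    """Cost of usage_kwh in one billing period under the tiered schedule."""
    baseline = baselines[season] * days
    # Absolute kWhr thresholds for this month (e.g. 414, 538.2, 828 in summer)
    thresholds = [m * baseline for m in TIER_BREAKS] + [float("inf")]
    cost, lower = 0.0, 0.0
    for rate, upper in zip(rates, thresholds):
        cost += max(0.0, min(usage_kwh, upper) - lower) * rate
        lower = upper
    return cost

# Example: a summer month at twice the baseline (27.6 kWhr/day, 828 kWhr total)
print(round(monthly_bill(27.6 * 30, "summer"), 2))   # about $157
```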
PG&E has frequent rate increases, averaging something like 5% - 6% a year. This rate of increase is higher than inflation, but low enough to keep the frogs from jumping out of the pot. They usually don't advertise a rate increase in terms of a percentage; instead they talk about the dollar increase in cost to the average user. This year that is about $5 per month - not much, it seems.
According to the California Public Utilities Commission (CPUC), the baseline is supposed to be about 50% of the average customer's use. Therefore the "average" user should expect that half of their electricity use will fall within tier 1 (the baseline amount). Tier 3 ends at 200% of the baseline, meaning that the average user gets billed across tiers 1, 2 and 3, which combined works out to an "average" rate of about $0.22/kWhr in 2014. Heavy users pay an additional surcharge for their use above tier 3. The advertised $5/month is spread across the various baselines, tiers and seasons. With an average cost of about $146/month, a $5/month increase is about 3.4%.
However, it isn't that simple. Looking back one year to 2013, the situation was very different.
Summer baseline = 18 kWhr/day
Winter baseline = 28.6 kWhr/day
- $0.13230/kWhr for tier 1
- $0.15040/kWhr for tier 2
- $0.31114/kWhr for tier 3
- $0.35114/kWhr for tier 4 and above
Summer baseline was 18 kWhr/day in 2013 (it is 13.8 today) and winter baseline was 28.6 kWhr/day (11.2 today). This means that last year it took 540 kWhr per month in summer to get out of the baseline tier, while this year it takes 414 kWhr; last year it took 858 kWhr in winter, and this year it is only 336 kWhr. The rates themselves don't look all that different, but the amount of power available at the lower rates before moving to higher tiers is MUCH lower. The smaller baseline amounts mean that the higher rates are reached sooner - pushing up the average cost much faster than the rates themselves would indicate.
Take the average cost based upon the same assumption about the definition of "baseline" (taken from the California Energy Commission), twelve 30-day months, and the same power use in 2014 as in 2013. The result is an annual cost of $1343 for 2013 and $1706 for 2014. That is a 27% increase in cost for the "average" family (about $30 per month). From the look of the numbers, it is pretty clear that the half of the users who use less than the average got a large rate increase, while the half who use lots of power got a rate decrease. On "average" it is about the same - the difference is entirely in who pays more and who pays less. The tier 3 rate got a nice 17% decrease, and the rates above that got a 10% decrease. Meanwhile, the folks who are careful with their power use got a substantial increase in their costs. They paid for the decreased costs of the large users, and then some. Averages don't necessarily show the important details.
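To check the year-over-year numbers, here is a similar sketch (again mine; the names and the split into six summer and six winter 30-day months are my assumptions) that prices the same household under the 2013 and 2014 schedules. The household is taken to use twice the 2014 baseline amounts every day, which is one way to read the "baseline is 50% of average use" rule.

```python
# Compare the annual bill for an "average" household under the 2013 and 2014
# tiered schedules. Assumptions (mine): six 30-day summer months, six 30-day
# winter months, usage fixed at twice the 2014 baselines in both years, and
# nothing on the bill beyond the tiered energy rates.

TIER_BREAKS = [1.00, 1.30, 2.00]        # tier tops as multiples of baseline

SCHEDULES = {
    2013: {"rates": [0.13230, 0.15040, 0.31114, 0.35114],
           "baseline": {"summer": 18.0, "winter": 28.6}},   # kWhr/day
    2014: {"rates": [0.14707, 0.17028, 0.25859, 0.31859],
           "baseline": {"summer": 13.8, "winter": 11.2}},
}

DAILY_USE = {"summer": 2 * 13.8, "winter": 2 * 11.2}        # 27.6 / 22.4 kWhr/day

def tiered_cost(kwh, baseline_kwh, rates):
    """Cost of kwh in one 30-day month under the tier structure."""
    thresholds = [m * baseline_kwh for m in TIER_BREAKS] + [float("inf")]
    cost, lower = 0.0, 0.0
    for rate, upper in zip(rates, thresholds):
        cost += max(0.0, min(kwh, upper) - lower) * rate
        lower = upper
    return cost

def annual_cost(year):
    sched = SCHEDULES[year]
    return sum(6 * tiered_cost(DAILY_USE[season] * 30,
                               sched["baseline"][season] * 30,
                               sched["rates"])
               for season in ("summer", "winter"))

c13, c14 = annual_cost(2013), annual_cost(2014)
print(round(c13), round(c14))                          # about 1343 and 1706
print(f"{100 * (c14 / c13 - 1):.0f}% increase")        # about 27%
```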
Finding this to be fascinating, I started wondering what impact this dramatic shift in baselines and rates might have on the economics of energy efficiency and rooftop solar systems.
This might be a good time to introduce the concept of "net metering" for solar systems. The idea is pretty simple. If you have a solar system under a "net metering" agreement, then there is only one power bill a year, at the time of "true-up", which is usually the anniversary of when the system was first powered up. As the year goes along, power flows either from the utility, or toward the utility when the solar system is making more power than the building is using. The meter readings increase and decrease depending upon which way the power is flowing. At the end of each month a tally is made and you are either credited or charged for the value of that month's net power. The value is based upon the daily average use and the daily baseline quantity. Under this scheme, excess power made at a time of surplus production gets credited against the cost of power when there isn't sufficient solar production to cover the use. If the rate is a time-of-use (TOU) agreement it is a little more complicated, but a similar approach is used, tallying monthly use and production in the various TOU time slots. (If anyone is interested I can describe what happens in that case - it is totally mind boggling and yet another way that the utilities and the PUC distort what should be an obvious concept - but I don't have nearly enough space in this discussion for that.)
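Here is a simplified sketch of the monthly tally just described. It is my reading of the scheme, not PG&E's actual NEM accounting: each month the net kWhr on the meter is valued against the tiered schedule, months with net production are credited (here at the tier 1 rate, which is an assumption on my part), and the running dollar balance is settled at true-up. The twelve months of net usage in the example are made-up numbers.

```python
# Simplified monthly net-metering tally on a non-TOU tiered schedule.
# Real bills add minimum charges, non-bypassable charges, and other items.

RATES = [0.14707, 0.17028, 0.25859, 0.31859]   # 2014 tiers, $/kWhr
TIER_BREAKS = [1.00, 1.30, 2.00]

def tiered_value(net_kwh, baseline_kwh):
    """Dollar value of one month's net usage (negative = net production)."""
    if net_kwh <= 0:
        return net_kwh * RATES[0]               # credit exports at tier 1 (assumed)
    thresholds = [m * baseline_kwh for m in TIER_BREAKS] + [float("inf")]
    cost, lower = 0.0, 0.0
    for rate, upper in zip(RATES, thresholds):
        cost += max(0.0, min(net_kwh, upper) - lower) * rate
        lower = upper
    return cost

# Twelve hypothetical (net use, monthly baseline) pairs in kWhr for one NEM year:
# summer months export a little, winter months draw from the grid.
year = [(-60, 414), (-90, 414), (-120, 414), (-80, 414), (-30, 414), (20, 414),
        (150, 336), (260, 336), (340, 336), (320, 336), (250, 336), (90, 336)]

true_up = sum(tiered_value(net, base) for net, base in year)
print(f"annual true-up bill: ${true_up:,.2f}")
```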
The important point is that the instantaneous amount of power being used or produced doesn't matter; what counts is the monthly tally. In general, solar systems are designed to produce enough power over a year to offset the annual use ("net zero" energy use). While the rates in the tiers are constant throughout the year, the baseline amount is different in summer than in winter. The 2014 summer baseline shrank modestly, to about 76% of the 2013 value, but the winter baseline dropped to 39% of its 2013 value. Most solar power in the summer is made and used on site for air conditioning, with the excess credited against winter use. During the winter, when there is not much solar production, the baseline shrank substantially, meaning that most of the power purchased is billed at the higher tiers. This has the effect of making solar power much less attractive: the excess is produced and credited at low summer rates, while the winter purchases it has to offset land in the high tiers. This change makes a large shift in the economic value of a solar investment.
In my case, when I installed my solar system it was designed and operated to get close to net zero, but not to make extra. For the first couple of years our annual true-up bill was about $60 a year. However, when the baseline was changed, that turned into a bill of about $400 a year even though we have been continually making energy efficiency improvements in many small, and not so small, ways. While this is not a tremendous difference, it came as a surprise and was certainly not in my calculations for determining the financial viability of the investment. Large, unanticipated changes in the cost of power, and in the way that it is credited back to the user, create an unstable investment environment.
The same logic applies to retrofits or improvements in the energy efficiency of the building envelope and appliances. For my home, when I installed solar the top tier of my rate schedule was $0.44/kWhr; now it is $0.25/kWhr. To see why this change matters, consider replacing a $1, 70 watt incandescent light with a $10, 9 watt LED. If the light operates 6 hours a day, every day, it uses about 153 kWhr a year as an incandescent and about 20 kWhr as an LED. At $0.44/kWhr the incandescent costs about $67/year to run and the LED about $8.70, a savings of roughly $59/year (about $49 in the first year after paying for the lamp). At the new rate of $0.25/kWhr the incandescent costs about $38/year and the LED about $4.90, for a savings of about $33/year. The rate change cuts the annual savings nearly in half - enough to turn a marginal investment in energy improvements into a questionable, or negative, one. While the "average" rate for electricity has increased steadily (about 5% per year) during that time, there has been a very large shift in how the rates are distributed across the tiers, and that shift is what drives the economics.
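As a quick check of the lamp arithmetic (my own, using the assumptions above), the sketch below computes the yearly savings and simple payback at the old and new top-tier rates. The same per-kWhr drop stretches the payback of any efficiency measure by the same factor, here roughly 1.8x, so upgrades that were already marginal are the ones pushed over the line.

```python
# Yearly savings and simple payback for swapping a 70 W incandescent for a
# 9 W LED, at the old and new top-tier rates. Lamp prices and hours of use
# are the assumptions stated in the text above.

HOURS_PER_DAY = 6
OLD_RATE, NEW_RATE = 0.44, 0.25          # $/kWhr, top tier before and after
INCANDESCENT_W, LED_W = 70, 9            # lamp wattages
LED_PRICE = 10.00                        # up-front cost of the LED lamp

def annual_kwh(watts):
    return watts * HOURS_PER_DAY * 365 / 1000.0

def annual_savings(rate):
    """Yearly operating-cost savings from the incandescent-to-LED swap."""
    return (annual_kwh(INCANDESCENT_W) - annual_kwh(LED_W)) * rate

for label, rate in (("old top tier", OLD_RATE), ("new top tier", NEW_RATE)):
    s = annual_savings(rate)
    print(f"{label}: saves ${s:.2f}/yr, simple payback {LED_PRICE / s:.1f} years")
# old top tier: saves $58.78/yr, simple payback 0.2 years
# new top tier: saves $33.40/yr, simple payback 0.3 years
```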
It is interesting how things are changing in the power pricing world. It appears that the more people attempt to conserve and use less power, the higher the rates they pay - and the more power they use, the bigger the incentive to keep doing the same.