Data analyst Hao Shen has been pursuing a growing suspicion. “We thought that more modern photovoltaic systems would be more reliable and efficient than those commissioned 10 years ago,” says Shen, head of data products at data analytics and climate insurance firm kWh-Analytics.
The company has compiled performance data from the portfolios of 15 of the 20 largest system operators in the United States. It then compared the revenue reports prepared before the plants went online with the actual production data. That comparison, covering 30% of ground-mounted plants in the United States, showed “just the opposite is true,” Shen says. The discrepancy between predictions and real output has increased.
The cause for concern is that a discrepancy between forecast and output carries real consequences. Whether a PV system is expected to generate 4,000, 3,920 or just 3,760 GWh over a 20-year period is anything but a small matter. At €0.07/kWh, 240 GWh less yield means lost revenue of €16.8 million.
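The arithmetic behind that figure is straightforward. As a sketch, using the article's example numbers (the function name and structure are ours, not from any industry tool):

```python
# Illustrative revenue-loss arithmetic; tariff and yields are the
# article's example figures, everything else is a made-up helper.
TARIFF_EUR_PER_KWH = 0.07

def lost_revenue_eur(forecast_gwh: float, actual_gwh: float) -> float:
    """Revenue lost when a plant produces less than forecast, at a fixed tariff."""
    shortfall_kwh = (forecast_gwh - actual_gwh) * 1e6  # 1 GWh = 1,000,000 kWh
    return shortfall_kwh * TARIFF_EUR_PER_KWH

print(round(lost_revenue_eur(4000, 3760)))  # 240 GWh shortfall -> 16800000
```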
If fewer electrons flow from a PV asset than expected, it can put the owner in financial trouble. This is because project planners use the forecast from the yield report to determine the value of the plant. Banks and investors also want to see the yield report before they invest in a plant. They use the report to calculate the monthly cash flow. This determines how high the repayments for loans have to be.
If the report is off by just a few percentage points, the repayments become a burden for the operator. And that’s exactly what’s increasingly happening.
“We have found that appraisals are becoming more inaccurate,” says Hao Shen. The newer yield forecasts statistically overestimate the plant’s yield more often – by roughly 6% on average. In contrast, the mean of the old plants’ appraisals is more in line with actual production.
Shen offers several explanations for the observed discrepancy. For example, he says, current assumptions about system availability and component degradation rates are overly optimistic. Another problem is that the PAN files on which the yield forecasts are based are too “aggressive,” he says.
PAN files contain detailed information about a module that goes beyond the data sheet. Appraisers can use the values in the files to calculate low-light performance or the effect of the angle of incidence on yield. However, anyone preparing an expert opinion should look carefully at where the PAN file comes from. Many small – and thus difficult to trace – changes to decisive parameters can make a PV module’s performance look better than it is.
If you calculate with fairytale PAN files, you cannot expect the predicted yield to be achieved in reality. This is not without consequences. For PV projects in the United States, the baseline annual risk of default is 16% if the credit line is 1.25 times the cash flow, Shen says. Because misconceptions in revenue reports compound with each passing year, Shen estimates the cumulative risk of default at 70% after seven years.
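Shen's seven-year figure is consistent with simple compounding of the annual risk, if one assumes (a simplification on our part) that each year's 16% default risk is independent:

```python
# Cumulative probability of having defaulted by year n, assuming an
# independent annual default risk p: 1 - (1 - p)**n.
def cumulative_default_risk(annual_risk: float, years: int) -> float:
    return 1 - (1 - annual_risk) ** years

print(round(cumulative_default_risk(0.16, 7), 2))  # -> 0.7, matching Shen's estimate
```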
Still, there are plenty of financial incentives to sprinkle some fairy dust onto PAN files and thereby drive up yield appraisals. When a project developer is planning a plant, manufacturers offer modules and provide the corresponding PAN files. If a module’s yield in the forecast is 1% to 2% higher than a competitor’s, the project can generate several million more in revenue – at least on paper.
“Then it can also happen that the project developer contacts the manufacturer of the worse-performing module and tells them that they are a bit behind,” says Tristan Erion-Lorico, head of module business at testing institute PV Evolution Labs. “Here’s a new PAN file. Try it again,” sometimes comes as a response. Project developers may well have an interest in this approach. “If a developer is building photovoltaic systems in order to sell them at a profit shortly after the construction phase, every improvement in the PAN file means more money,” says Erion-Lorico.
Matthias Hadamscheck also knows the influence that optimistic PAN files can have on yield assessments. He heads technical consulting at the company Meteocontrol which, among other things, prepares expert opinions for project developers, investors and financing banks. “For a yield assessment, the plant is virtually built in the simulation software based on the information about the plant – such as the location, the components and their configuration,” he says. “With the appropriate weather data, the yield is calculated for the site.”
In PVsyst, a widely used modeling program for PV systems, information from the modules’ data sheets is stored. This includes, for example, the measurement under standard test conditions – that is, at an irradiance of 1,000 W/m2 and a temperature of 25 C. But how much electricity the module generates at 400 W/m2 and 10 C cannot simply be read off the data sheet.
To estimate the behavior of a module under different irradiance levels and temperatures, one has to measure parameters that are usually familiar only to module experts. These include the series resistance and the shunt resistance, including the latter’s so-called exponential factor. In addition, there is the incidence angle modifier (IAM) and the diode quality factor, gamma.
This spectrum of measurements is not readily available, even in an R&D or test laboratory. A total of five parameters, also called the uncertain parameters, are decisive in the end. PVsyst sets the values for these parameters as a basic assumption, the “default values,” in the program, relying on its own empirical values.
“The most important uncertain parameters are series resistance and shunt resistance,” says André Mermoud, the founder and developer of PVsyst. In the past, many module manufacturers complained that the assumptions for “Rserie” and “Rshunt,” as they are called in PVsyst, were made too conservatively – making the modules in PVsyst look worse than they are.
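To see why Rserie and Rshunt matter, here is a minimal sketch of the one-diode model on which PVsyst's module simulation is built. The parameter values below are invented for illustration and correspond to no real module; the fixed-point iteration is just one simple way to solve the implicit equation:

```python
import math

# One-diode model sketch: I = Iph - I0*(exp((V + I*Rs)/(n*Ns*Vt)) - 1) - (V + I*Rs)/Rsh
# The equation is implicit in I, so we iterate to a fixed point.
# All values are illustrative, not taken from any PAN file.
def diode_current(v, iph=9.0, i0=1e-10, rs=0.3, rsh=300.0,
                  n=1.1, ns=72, t_cell_c=25.0):
    vt = 8.617e-5 * (t_cell_c + 273.15)  # thermal voltage per cell, in volts
    i = iph  # start from the photocurrent
    for _ in range(100):  # simple fixed-point iteration
        i = iph - i0 * (math.exp((v + i * rs) / (n * ns * vt)) - 1) \
                - (v + i * rs) / rsh
    return i

# At short circuit (V = 0) the current is close to the photocurrent;
# the small gap is exactly the Rs/Rsh loss term being discussed.
print(round(diode_current(0.0), 2))  # -> 8.99
```

Shrinking Rserie or enlarging Rshunt in such a model reduces the loss terms, which is why small tweaks to these two parameters can quietly lift simulated yield.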
Independent labs have also reported that PVsyst’s “default values” have led to conservative yield predictions in the past. However, the basic assumptions in the current version of PVsyst, Mermoud says, are “pretty close to or just below what we see with most modules with valid low-light measurements.”
Module manufacturers can alternatively have a valid low-light measurement performed by an accredited laboratory to correct the perceived overly conservative base assumptions in PVsyst’s library. The modules are then tested not just once at 1,000 W/m2 and 25 C, but at 22 combinations of temperature and irradiance – at 15 C, 25 C, 50 C and 75 C, and at various irradiance levels between 100 W/m2 and 1,100 W/m2.
The procedure for this range of measurements is laid down in the IEC 61853 standard. PVsyst uses a low-light measurement according to this IEC standard to fit the uncertain parameters to the 22 measurement points. In doing so, PVsyst modifies the uncertain parameters following a very specific procedure until the simulated efficiency curve matches the measurement. The box above sets these out in more detail.
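The fitting idea can be illustrated with a deliberately simplified stand-in: vary one uncertain parameter until the simulated efficiency curve best matches the measured points. The efficiency formula and "measurement" points below are invented for the sketch, not real IEC 61853 data, and real fits cover several parameters at once:

```python
# Toy fit of a single uncertain parameter ("rs") against measured
# efficiency points; the model and data are invented for illustration.
def simulated_efficiency(irradiance, rs):
    # crude stand-in model: resistive losses grow with irradiance
    return 0.20 - rs * (irradiance / 1000.0) ** 2

measured = [(200, 0.199), (600, 0.192), (1000, 0.180)]  # (W/m2, efficiency)

def sse(rs):  # sum of squared errors over the measurement points
    return sum((simulated_efficiency(g, rs) - eff) ** 2 for g, eff in measured)

# brute-force search over candidate parameter values
best_rs = min((step / 1000 for step in range(0, 101)), key=sse)
print(best_rs)  # -> 0.02
```

Fitting five such parameters at once against noisy data is exactly the stability problem Mermoud raises later in the article.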
Scarce low-light measurements
“But in recent years, we have received very few low-light measurements from manufacturers,” Mermoud says. He estimates that between 20 and 25 manufacturers have submitted valid low-light measurements. This number, he reports, was higher for older modules. Over the past two or three years, he says, he has received hardly any such laboratory measurements.
So how do project developers safeguard themselves? For one thing, they can pay attention to where they get their PAN files. The labs that do the low-light measurements can also define the five uncertain parameters, based on the measurement.
Most PAN files today are created by independent laboratories. Anyone using such PAN files should be sure to request a report on how the values were determined – not least because there is no IEC standard specifying exactly how laboratories transfer the measured values to the module model PVsyst uses, the single-diode model. Such a report can give less technically oriented decision makers the assurance that the values were determined according to quality standards and can therefore be used for serious forecasts.
For the PAN files, Fraunhofer ISE, like other well-known laboratories that create PAN files, varies the parameters until the simulated and measured efficiency curves match as closely as possible, says Ulli Kräling, the quality manager at Fraunhofer ISE. The procedure may differ from PVsyst’s own methodology.
“Different procedures exist to determine the series resistance Rserie, for example,” Kräling says. “However, this series resistance does not correspond to an actual resistance, but only represents a value that provides the correct result for the corresponding module model. If the procedure or the model changes, then other values for the series resistance result. That means there is no ‘correct’ series resistance.” But he also adds that it would not make sense to use a model that differs from PVsyst if the values were later used for a yield calculation with the software.
Further complicating matters is that not all labs follow the same process. Some labs use all five uncertain parameters as variables, which leads to unstable results, Mermoud says. “To me, this method can’t be reliable because it’s impossible to get a stable fit for five parameters in such a nonlinear equation and for such uncertain data points,” he argues.
As an example, Mermoud cites the value for the exponential factor of the shunt resistance. According to PVsyst, that should be about minus 5.5. But, he reports, the value sometimes comes out at 0, or even at 10 to 15, when all the uncertain parameters are used as variables.
Asked whether this methodological difference alone could add several percent of yield – and thus explain Hao Shen’s finding of a 6% discrepancy – Mermoud says no. Nevertheless, he recommends caution before importing PAN files processed in this way into the PVsyst library without further checks. PVsyst cannot verify the exact methodology behind every PAN file from every laboratory. Alongside well-trusted industry labs, which have an interest in maintaining their reputation, there are labs that are more likely to produce very ambitious PAN files, he said.
Lack of integration
For this reason, PVsyst does not integrate PAN files created in the laboratory into its own database. The software provider wants to prevent unrealistic PAN files from appearing in its own library – but as a result it offers no blanket quality control of PAN files, so those evaluating yield simulations have to do the checking themselves. It is worth taking a closer look even at PAN files produced by an independent laboratory, says Hadamscheck. While PAN files from “reputable” labs are more reliable than others, he says, “We look at all PAN files and evaluate them.”
Another way to be sure: PAN files created by independent external test labs, with corresponding reports on how the values were determined, give all parties involved in a project the assurance that the data is serious and reliable. In such a report, technically skilled users of the PAN file can understand how the laboratory arrived at the values of the uncertain parameters.
Last but not least, the PAN-file creation reports can also prevent the information from being subsequently falsified. This is also a problem, says PVEL’s Erion-Lorico. It has already happened that some module manufacturers have subsequently changed the values in the PAN file created by a reputable laboratory. There are many small adjustments that the authors of the PAN files and the yield reports can make to get a better result.
Angles of incidence
One parameter that has frequently been embellished lately, Mermoud says, describes the behavior of the glass when light falls on it at an angle. “This parameter can significantly increase the prediction compared to reality,” Mermoud says. “Many manufacturers offer their own PAN files with completely different incidence angle performances. This can result in the real plant underperforming the simulation by one to two percent.”
Mermoud considers the incidence angle data provided to PVsyst by many manufacturers to be “completely unrealistic.” Citing Sandia National Laboratories in the United States, PVsyst says the incidence angle modifier (IAM) should correspond to 95% to 96% light transmission at 60 degrees of incidence, and sets that value as its base assumption. But more and more manufacturers are specifying values of 99%.
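For orientation, the widely used ASHRAE incidence-angle model reproduces exactly this magnitude: with its b0 parameter at 0.05, transmission at 60 degrees comes out at 95%. This is an illustration of the numbers involved, not PVsyst's internal IAM profile:

```python
import math

# ASHRAE incidence-angle model: IAM = 1 - b0 * (1/cos(theta) - 1).
# b0 = 0.05 is a common default; at normal incidence (theta = 0) IAM = 1.
def iam_ashrae(theta_deg, b0=0.05):
    theta = math.radians(theta_deg)
    return 1 - b0 * (1 / math.cos(theta) - 1)

print(round(iam_ashrae(60), 3))  # -> 0.95, i.e. 95% transmission at 60 degrees
```

A manufacturer claiming 99% at 60 degrees is, in this model's terms, asserting a b0 of about 0.01 – far better optics than the default assumption.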
Another parameter that the authors of yield reports are watching is the reflectance of light off the module. “We see a development that module manufacturers are increasingly providing PAN files that show extremely good reflectance behavior,” says Hadamscheck. It is the job of appraisers to filter out overly optimistic assumptions where necessary. “If we are given very good values, we stick to certain minimum losses due to reflection,” says Hadamscheck. “The reason for this is that we don’t yet have a good and fully comprehensible technical explanation as to why the reflection behavior should be so good.”
Users of PAN files should also keep measurement uncertainty in mind. “From my experience, I have found that PAN files can give different values even for the completely identical product measured in different external test laboratories,” says a representative of a module manufacturer who requested to remain anonymous. “This is partly because of the measurement accuracy for some parameters and also because of possible other influences of the test procedures.”
Some module manufacturers can take advantage of measurement uncertainty, Erion-Lorico reports. There is an inherent measurement uncertainty in flash testing, according to the module specialist. He believes it is possible for some manufacturers to send their modules to five different labs or more.
“The results are then probably within plus or minus two percent of each other,” Erion-Lorico says – which is sufficient to skew forecasts. Manufacturers can then pick the best PAN file and give it to the project designer or reviewer. Another observation of Erion-Lorico’s should additionally lead to caution with PAN files: “I’m not aware of a single module manufacturer that gives a warranty on the performance calculated by a PAN file.”
All of these variables, discrepancies, and shades of grey can be exploited and turn fairytale forecasts into a nightmare for asset owners. “It’s death by a thousand cuts,” Erion-Lorico says. In the end, he says, it’s hard to track which parts of the PAN file were subject to aggressive assumptions. Those who are careless are left with high costs in cases of doubt.
A solution to the problem, however, lies not only in ever more precise yield forecasts. The “safety cushions” built into the loan amount should also be reconsidered. In the United States, the industry standard is 1.3 times the cash flow, says Hao Shen. But more and more banks are leaning toward 1.25 times cash flow. That could exacerbate the problem in the future.
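A back-of-the-envelope sketch shows why the cushion matters: with a coverage ratio r, debt service is sized at expected cash flow divided by r, so a 6% production shortfall – the average overestimate in Shen's data – consumes most of a 1.25x cushion's headroom. The numbers below are illustrative:

```python
# How a forecast shortfall erodes the debt service coverage cushion.
# Illustrative arithmetic only; real deal structures are more complex.
def actual_coverage(expected_cash_flow, coverage_ratio, shortfall_pct):
    debt_service = expected_cash_flow / coverage_ratio  # sized off the forecast
    actual_cash_flow = expected_cash_flow * (1 - shortfall_pct / 100)
    return actual_cash_flow / debt_service

# A 6% overestimate shrinks the cushion toward the 1.0 break-even line:
print(round(actual_coverage(100, 1.25, 6), 3))  # -> 1.175
print(round(actual_coverage(100, 1.30, 6), 3))  # -> 1.222
```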