The GB settlement process is a hard-wired ‘black box’, mostly invisible to participants, dating back to the start of the competitive industry. But October should see the beginnings of an open, flexible approach that gives new participants access to much more data. Elexon’s Peter Stanley talks to Janet Wood about opening up the system.
Peter Stanley is director of digital operations supporting the transformation of the energy industry at Elexon.
The industry-owned company manages payment allocation and flows for GB’s electricity industry. But this ‘balancing and settlement’ process runs on a system that is not fit for purpose for the future industry, in which millions of participants will be charged or take action on a half-hourly basis, served by very different energy companies.
Stanley recalls the starting point of Elexon’s system, which was “very specifically designed around how the market was constructed in the late 1990s – 1998… with settlement administration and a whole notion of incentives around imbalance.” At that time there was a small number of generators and six main suppliers, and the systems were hard-coded around the way that market was intended to work. Now, he says, “We are seeing a huge transformation in the way the market operates, particularly in light of things like virtual lead parties who take advantage of much smaller units of either generation or demand response.
“The sorts of changes we are seeing now around the flexibility markets and the aggregation of much smaller-scale demand units or generation units, battery storage, mean it is really challenging our ability to incorporate those solutions into [the system]”.
He adds, “We have had to really rethink the way we architect the systems themselves in order to build the flexibility in there, in order to accommodate business models that frankly we haven’t even thought of.”
Along with more market participants, the system has more data and faster action to accommodate. But Stanley says future-proofing is more than that. To enable new markets the system has to be “agnostic about market structures” and, just as important, it has to be able to give users more insight. That means allowing for analytics of all the data, on whatever basis is required by new business models.
Historically Elexon has produced standard reports on the market that have been useful for predicting supply and demand or pricing. But he says, “in future we can’t really predict who will want what. Part of what we are doing is opening up our data to make it more accessible and to allow organisations to bring their own thinking about analytics to the data. Rather than us push out raw data, we want to make it much more open and accessible.”
Another aim is flexibility inherent in the systems – not just in the technology, but also in Elexon’s ability to undertake new things and be flexible in incorporating changes.
He says, “There is very little point in having whizzy technology without necessarily having processes that sit behind it. Introducing new products through agile methods is very important, so is engaging earlier with industry in order to understand what constitutes value”.
Part of that is making it easier to change the Balancing and Settlement Code (BSC), and that process is being updated, although it will remain a formal one – currently governed by the BSC Panel, although that may change. But Stanley thinks that equally important are processes at Elexon’s discretion, like the provision of data. “Where we are impacting settlement itself that has to be strictly governed, but when you are talking about insight services, analytics services, they are far more within our own discretion, as we have already demonstrated in the publishing we do via the Balancing Mechanism Reporting Service (BMRS).”
Talking about the experience of the old ‘black box’ system, Stanley says, “Data comes in, it gets processed, and you get an answer spewed out. What happens in the middle? Over 20 years it has become increasingly difficult to understand what goes on inside that box.
“What we have done with the new architecture is break that box open and rebuild it as a set of much smaller discrete components, which are then orchestrated in a way that you can see the transactions as they move through the algorithm. That has a couple of benefits. One is our ability to understand and effect change quickly, because you are only changing one small component at a time – and obviously to test that more speedily. The other is the ability to look into the data … It opens up access to interim data sets as they move through the system – you are not bound only by the output – and you can extract it at any point.”
That will save time – at the moment, if a settlement run fails because of bad data, Elexon has to find the root cause in the data and restart. “Now we have an operator portal that allows us to see exactly where in the process it failed and why it failed, and restart the process from that point.”
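As a way of picturing that shift, the sketch below breaks a settlement run into small, separately testable steps, keeps an interim snapshot after each one, and allows a failed run to be restarted from the failing step. It is a minimal illustration only: the step names, data shapes and calculations are invented for this example and do not describe Elexon Kinnect’s actual components.

```python
# Illustrative sketch only: the step names, data shapes and calculations here
# are invented for this example and do not describe Elexon Kinnect's design.
from typing import Callable, Dict, List, Optional

# Each settlement step is a small, separately testable function that takes the
# interim data produced so far and returns an updated copy of it.
Step = Callable[[Dict], Dict]

def validate_meter_data(data: Dict) -> Dict:
    readings = data["meter_readings"]
    if any(r["kwh"] is None for r in readings):
        raise ValueError("missing meter reading")   # the failing step is obvious
    return {**data, "validated": readings}

def aggregate_by_party(data: Dict) -> Dict:
    totals: Dict[str, float] = {}
    for r in data["validated"]:
        totals[r["party"]] = totals.get(r["party"], 0.0) + r["kwh"]
    return {**data, "party_totals": totals}

def calculate_imbalance(data: Dict) -> Dict:
    charges = {party: (volume - data["contracted"].get(party, 0.0)) * data["imbalance_price"]
               for party, volume in data["party_totals"].items()}
    return {**data, "imbalance_charges": charges}

PIPELINE: List[Step] = [validate_meter_data, aggregate_by_party, calculate_imbalance]

def run_settlement(data: Dict, start_at: int = 0, snapshots: Optional[List[Dict]] = None) -> Dict:
    """Run the pipeline, keeping an interim snapshot after every step so a
    failed run can be inspected and restarted from the failing step."""
    snapshots = [] if snapshots is None else snapshots
    for index, step in enumerate(PIPELINE[start_at:], start=start_at):
        data = step(data)        # change and test one small component at a time
        snapshots.append(data)   # interim data set, extractable at any point
    return data
```

Restarting from the point of failure then amounts to calling run_settlement again with start_at set to the failed step’s index and the last good snapshot as the input.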
The new system – Elexon Kinnect – is a modular solution that lives in the ‘cloud’, and that brings flexibility: Elexon can scale the computing power up or down according to need. Stanley says that provides optionality in a future where, instead of 48 half-hourly settlement periods, “you can imagine a world where we move to 15-minute or even 5-minute settlement. At the moment that would require six times more computer power and time to process individual items, whereas in the new architecture we can spin up multiple algorithms to process them. And the growth in data we can handle and process through parallel systems.” That capability is being built into the new architecture for the market-wide half-hourly settlement programme.
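The scaling idea can be pictured with a toy example: instead of one serial run over 48 half-hourly periods, each finer-grained period is settled by its own worker. Again this is only a sketch under assumed names – the per-period calculation is left as a placeholder.

```python
# Illustrative only: parallel settlement of finer-grained settlement periods.
from concurrent.futures import ProcessPoolExecutor

PERIOD_MINUTES = 5                              # 5-minute rather than half-hourly periods
PERIODS_PER_DAY = 24 * 60 // PERIOD_MINUTES     # 288 periods instead of 48

def settle_period(period_index: int) -> tuple:
    """Placeholder for the per-period settlement calculation."""
    # ... fetch metered volumes and prices for this period, compute charges ...
    return period_index, 0.0

if __name__ == "__main__":
    # Spin up multiple workers so six times more periods need not mean
    # six times more elapsed time.
    with ProcessPoolExecutor() as pool:
        results = dict(pool.map(settle_period, range(PERIODS_PER_DAY)))
```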
Data boost
Better data will improve the settlement process. Stanley says assets, particularly at the distribution level, have “not necessarily been well catalogued in terms of what is on the system and what isn’t”, and market-wide half-hourly settlement (MHHS) will improve meter data, which in turn will improve Elexon’s ability to compress the settlement timeline – which currently extends out to 14 months.
But Elexon is in inquiry mode. Stanley says: “we are keen to engage with a wide variety of our customers – particularly those that don’t typically represent generators or suppliers, or that have new or more innovative business models – so they can help us understand what they need to be successful.” At a minimum, Elexon understands that means “access to more, and more-granular, data so they can get a better understanding. Modifications that give us behind-the-meter data will be key to that, as well as this interim data.” But the company wants more customer views and it has set up a user group to help steer investment in its ‘Insights’ data platform. Stanley says, “We are very keen to get the voice of the customer helping us understand how we can help.”
He explains that historically Elexon has maintained elements of standing data and customer data, where it ensures that data are accurate. With faster switching and the need for accurate data for new markets, “The customer solution is moving to online self-serve, so we are placing the responsibility for managing and maintaining the data with participants themselves. Rather than them highlight issues that we have to correct, they manage the quality of their own data – as they are obliged to do.” Participants, he says, “should know their own data”.
The data is currently held in silos relating to each agent, but in the new architecture “all data will be in a single data lake and accessible in context for whatever data groups you need, and accessible through APIs as well as the standard file formats.”
Stanley says, “We can’t and wouldn’t lose the legacy file format, because many parties are dependent on it, but we need to recognise that there will be new parties who don’t have these legacies who are looking for something much more open.” There will be different sets of data available – “One example is that forecast data on BMRS is available for 14 days three years ahead and that’s available daily. We store the data at the moment and there are only certain subsets that are available. In future all data, both final and interim, will be available, including real-time and forecast.”
When I ask whether that is future-proof for a changing market, Stanley says, “We can ingest any new data and will store it in our data platform and make it available. What is likely to change is the development of APIs, and we will keep up to date with the development of APIs to make the data available in the way people want it.”
The intention is to make all data available via advanced APIs (as well as existing formats), which will give alternative ways for parties, especially new entrants who are more technology-enabled, to get hold of data.
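For a rough sense of what API access could look like for such a party, here is a hypothetical fetch. The host, endpoint path and parameter names are placeholders invented for this sketch, not Elexon’s published interface.

```python
# Hypothetical illustration: the URL, path and parameter names are placeholders,
# not a real published API.
import requests

BASE_URL = "https://api.example-settlement-data.test"   # placeholder host

def fetch_dataset(dataset: str, settlement_date: str, period: int) -> list:
    """Pull a dataset as JSON over HTTP rather than parsing a legacy flat file."""
    response = requests.get(
        f"{BASE_URL}/datasets/{dataset}",
        params={"settlementDate": settlement_date, "settlementPeriod": period},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

# e.g. rows = fetch_dataset("imbalance-prices", "2022-06-01", period=17)
```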
Could that data help companies weather difficult markets like the one GB is currently experiencing? Stanley is cautious: “There is certainly potential for greater availability of demand-side response and more storage capacity in Britain, which can improve security of supply. A key factor in making the most of this is understanding the capability and availability of smaller asset providers.
“This is where the greater volumes of data available through Kinnect will play an important part, together with major changes that we are making to the code which give more opportunities for smaller asset owners to provide balancing services.
“Through Kinnect data, participants will have much more granular information about the performance of DSR and storage units, which will make it easier for suppliers and networks to pick the most efficient actions to tackle tight supply and demand scenarios.”
Out of the sandbox
A key benefit of having the platform established within the cloud, and the infrastructure established as code, is the ability to test innovation. A ‘sandbox’ approach used in the industry in recent years allows new business models to enter the market via a derogation, which suspends some rules while the business case is established and fears of unintended consequences or market distortion are allayed.
But on the new platform, Stanley says, that can be done without taking any risk in the real market.
“We can spin up an entirely parallel settlement environment, against which we can simulate those new business models and simulate the effect that there would be on the market – using real data in parallel. We don’t necessarily have to take a risk of creating market distortion. We can see the impact of it at all stages of the settlement process, so we can use that as the evidence or the basis for market rule changes to accommodate the business model.”
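Reusing the run_settlement sketch above, a parallel simulation of a candidate rule change might look something like the following – again an invented illustration, in which the “rule change” is simply a different imbalance price applied to a copy of the same real input data.

```python
# Illustrative sketch: run settlement twice on identical input data - once with
# current rules, once with a candidate change - and compare the outcomes.
from copy import deepcopy

def simulate_rule_change(real_input: dict, candidate_rules: dict) -> dict:
    """Return, per party, how the candidate rules would change imbalance charges."""
    baseline = run_settlement(deepcopy(real_input))                          # current market rules
    trial = run_settlement({**deepcopy(real_input), **candidate_rules})      # parallel run, no market impact
    return {
        party: trial["imbalance_charges"][party] - charge
        for party, charge in baseline["imbalance_charges"].items()
    }

# e.g. deltas = simulate_rule_change(real_input, {"imbalance_price": 95.0})
```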
The next year
This fundamental change to the industry’s bedrock has to enable, and run alongside, major industry change. Stanley says, “We have a route map and we complete investment in other components in sync with industry change.” But when will the new options be available for innovators?
Although the intent is to complete migration of the whole platform from legacy by 2023, the Insights platform is due to go live in October this year. “Innovation starts for a subset of data in November 2021. The remaining elements of data will be coming on through the Insights platform in 2022, so we would be hoping to complete the development in 2022. The core of the data around settlement will be going live in June 2022, on completion of the settlement administration agent as part of the settlement solution.” The settlement administration agent performs the daily settlement runs.
Users “will see a rapid increase in the availability of data throughout 2022 and starting in October this year.”
But Stanley is keen to go further than a passive approach to opening up the data. He says, “We are not constrained by what we are rolling out. If there are new innovators emerging who are delayed by Elexon’s data, I would encourage them to get in touch now – they don’t have to wait. If there really is something needed we would be very keen to understand that, to get them up and running as soon as possible.”
To join the Data and Reporting User Group (for the Kinnect Insights Solution) email [email protected].