Intelligent analytics collect vast amounts of data
Today’s rate and speed of digital development is phenomenal, and this rate of change is mirrored in the power generation industry: rapid development of information technology is enabling increasingly sophisticated operation of combined-cycle gas turbine plant. Big data, the Internet of Things, wireless mesh networks and cloud computing are all making their mark.
Increases in computing speed and capacity have enabled collection, analysis and storage of increasing volumes of information. New software platforms are used to interpret data and feed back information enabling optimization of operations and maintenance. As the speed and sophistication of real-time analytics increase, insights can be promptly looped back into the decision process.
The physical world is increasingly getting online as objects, devices and machines acquire more digital intelligence, while advances in connectivity mean that objects can be wirelessly integrated into information networks. The last few years have seen a huge increase in the use of wireless sensors and instruments in power plants. These observe and monitor their environment, communicating information about temperature, pressure, flow and vibration from the heart of the power plant back to the control centre.
Big data is generating datasets that are increasing exponentially in both complexity and volume. Analyzing, storing and applying this data is a considerable challenge. Companies such as GE are building cloud-based services with intelligent analytics to collect and combine vast amounts of data to use in industries which include power generation.
Ambient and load conditions significantly affect gas turbine and combined-cycle performance, so process simulation plays a key role in every large project.
Sophisticated software is used to simulate the thermodynamics and drive the optimization of the power plant. Experts can compile detailed models of gas turbines and all major components and simulate plant operations under the entire range of ambient and load conditions. Linking market models incorporates financial and environmental information so that operational costs can be projected.
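As a rough illustration of why ambient conditions matter so much, the sketch below applies a simple linear derating to a gas turbine's ISO-rated output as ambient temperature rises. The derating coefficient is a hypothetical placeholder, not vendor data; real heat-balance tools such as Ebsilon Professional use detailed component models rather than a single factor.

```python
# Illustrative correction of gas turbine output for ambient temperature.
# Hotter, less dense air reduces mass flow through the compressor, so
# output falls as ambient temperature rises above the ISO rating point.

ISO_TEMP_C = 15.0  # ISO rating conditions: 15 degrees C at sea level

def corrected_output(iso_rating_mw: float, ambient_c: float,
                     derate_per_deg: float = 0.007) -> float:
    """Estimate GT output at a given ambient temperature.

    derate_per_deg is an assumed fractional loss per degree above ISO,
    chosen purely for illustration.
    """
    return iso_rating_mw * (1.0 - derate_per_deg * (ambient_c - ISO_TEMP_C))

# A 250 MW (ISO) unit on a 40 degree C day loses roughly 17.5 per cent:
print(round(corrected_output(250.0, 40.0), 2))  # 206.25
```

Even this crude model shows why bidders must document operating points across the full ambient range rather than quoting a single rating.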
Austrian software specialist VTU Energy is currently modelling a large combined water and power plant for a bidder for an IWPP (Independent Water & Power Producer) contract in the Middle East.
The developers pull together technical information from vendors of gas turbines, desalination units and other components which VTU feeds into its overall plant model for use in the bid process.
The company uses the Ebsilon Professional heat balance software and its own Gas Turbine Library to build an accurate plant simulation model. This is used to find a commercial optimum while meeting the requirements of the tender and producing the most competitive tariff.
The contract is for a power and water purchase agreement for 25 years and bidders must submit around 100 documented operating points so the government can evaluate the bid in technical terms. VTU’s Dr Josef Petek explains why the simulation plays a vital role in putting together the bid:
“There is a very strong emphasis on the documentation of the technical capability of the plant. You are going for a long-term relationship that includes performance guarantees. The government owns the gas and is the sole buyer of the products, electricity and water, so understanding the efficiency and capacity of the power plant is essential.”
Thermal power plant generation has traditionally been dictated by load demand, but the increasing supply of intermittent renewable energy, notably in Europe, is transforming the pattern of demand, so that combined-cycle plants are called upon to provide flexible peak load. For an existing power plant operating in a deregulated competitive market, accurate prediction of plant capacity and fuel consumption under expected conditions for the days ahead is essential. Weather information (notably the likely availability of wind, hydro and solar) should be factored into the equation.
In order to bid into the market, plant operators need the ability to make accurate predictions of future operational costs. Under a capacity market operators will be looking to predict how best to operate in the next two weeks, informing their bids into the market.
Petek says: “In the past, in a regulated market, you could look back at the price of operation to make a price. Now the historic price of power is less significant: you have to bid into the future, looking at where the market is going, depending on the renewables.”
Factors will include likely demand and the weather forecast for the next two weeks.
“Based on this predictive work, you feed in your fossil generation and look at how many startups do you want to use in the next two weeks, what will be the cost of stop-and-go generation. Does it make sense to have low-load parking position? Does it make sense to reduce load to a minimum because electricity price is so low that it does not pay to shut down the plant and you are maybe paid for quickly ramping up and producing if immediate need arises?”
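The shutdown-versus-parking question Petek raises comes down to a cost comparison. The sketch below reduces it to its simplest form: park at minimum load if the accumulated parking losses over the unprofitable hours stay below the cost of one extra start. All figures and the hourly margin model are illustrative assumptions, not a real market bidding tool.

```python
# Hypothetical sketch of the shutdown-vs-low-load-parking decision.

STARTUP_COST = 40_000.0   # assumed cost of one start, in currency units
MIN_LOAD_LOSS = 1_500.0   # assumed loss per hour parked at minimum load

def park_or_shut_down(hours_unprofitable: int) -> str:
    """Decide whether to park at minimum load or shut down.

    Parking costs MIN_LOAD_LOSS per unprofitable hour but avoids the
    next startup; shutting down saves the parking losses but incurs a
    startup cost when the plant is needed again.
    """
    parking_cost = hours_unprofitable * MIN_LOAD_LOSS
    return "park at minimum load" if parking_cost < STARTUP_COST else "shut down"

print(park_or_shut_down(8))   # 12,000 < 40,000 -> park at minimum load
print(park_or_shut_down(48))  # 72,000 > 40,000 -> shut down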
In the past, efficiency was the priority for combined-cycle plant, but on networks that give priority dispatch to renewable sources it is increasingly flexibility, startup time and low load operation. Operators will rely on emerging predictive software to provide the answers.
Big data and cloud computing
Number-crunching the mounting volumes of data from a power plant requires additional processing power and broadband capacity. The aim is to increase efficiency of both operations and asset maintenance through improved understanding of the plant processes.
Eric Kauffman, Software & Analytics Strategy Leader at GE, says: “We have created a cloud-based version of our Efficiency Map product which brings data back centrally. This enables us to get better data by adding a reconciliation element that would be too complex to do on-site. It also enables our engineers to provide a second set of eyes for the customer.”
He cites an example of the benefits: “Calculations using plant sensors might show a curve with uncertainty rate of 1-2 per cent. By using a combination of a physics model, a precision test and the data reconciliation algorithm we have been able to demonstrate overall uncertainties of below 1 per cent.
“This gives customers better visibility to the existence and location of problems in the plant – and, more importantly, where the problem is not. It also gives a better understanding of plant capability so purchasers are able to buy fuel more efficiently and reduce the amount of safety margin that traders put into their bid.”
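To make the reconciliation idea concrete, here is a minimal weighted least-squares sketch: three flow measurements that should satisfy a mass balance (inlet = outlet A + outlet B) are nudged until the balance closes exactly, with the least-trusted (highest-variance) instruments absorbing most of the correction. The numbers are illustrative; GE's Efficiency Map combines this idea with a physics model of the whole plant.

```python
# Minimal data reconciliation sketch for the balance inlet = out_a + out_b.

def reconcile(inlet, out_a, out_b, var_in, var_a, var_b):
    """Weighted least-squares adjustment subject to inlet = out_a + out_b."""
    residual = inlet - out_a - out_b          # imbalance to distribute
    s = var_in + var_a + var_b                # constraint variance, A*V*A'
    lam = residual / s
    # Each measurement moves in proportion to its own variance.
    return (inlet - var_in * lam, out_a + var_a * lam, out_b + var_b * lam)

x_in, x_a, x_b = reconcile(100.0, 60.0, 45.0, 4.0, 1.0, 1.0)
print(round(x_in - x_a - x_b, 9))  # 0.0 -> the balance now closes exactly
```

The reconciled values are statistically more trustworthy than any single raw reading, which is how combining measurements with model constraints can push overall uncertainty below that of the individual sensors.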
Kauffman estimates that, in some deregulated US markets, an improvement of 1 per cent in accuracy can be worth over half a million dollars per year for a typical combined-cycle power plant. Increasing the amount of data from within the power plant means installing smart devices and instruments. GE has invested heavily in developing smart bus technology to reduce the number of terminations required to install it. Reducing wired connections saves parts, costs and time.
Mark Hachenski, executive product manager at GE, explains: “By using smart bus technology, we dramatically reduced the installation time by decreasing the number of wires and terminations. For instance, we have reduced 3600 terminations down to 1800, a 50 per cent reduction in the plant.”
Hachenski gives an example of how hardware developments go hand in hand with advances in communications technology: “In addition, smart bus technology can provide higher reliability for customers.
“In the past we used hydraulic fuel skids which would only annunciate four or five analogue diagnostics. Moving into the twenty-first century, with a smart bus electronic fuel skid we get 60 digital health bits to come back into our system. This provides better visibility of what is going on for quicker actions to resolve problems.”
However sophisticated the software and remote control systems, human operators are an essential part of the system. It is not helpful to swamp them with masses of data: it is important that the information presented by the software is relevant, accessible and comprehensible.
Hachenski explains GE’s approach to making screens intuitive and user-friendly: “One of the things we have done is to redesign the screen ‘look and feel’ so that operators can easily and quickly see how the plant is running. We are trying to visually represent what is important to the operator. We do not want to burden the operator with a tonne of information or with too many alarms.” Alarms alert the operator to a change, inform the operator of the nature of the change, and guide the operator toward a course of corrective action.
GE leverages its new GE Software business and team of user experience experts to help make the software and experience more user-friendly. The interface was tested by bringing in operators to use simulated screens while observers watched them walk through various alarms or faults in the system.
Hachenski says: “Customers tell us that operators make mistakes as they try to manually start up systems. We are looking at how to automate the process so that a startup screen walks them through a step-by-step sequence. However while some customers want operators to follow this, others don’t want operators interfering, so the software interface must accommodate different operators.”
Wireless mesh networks
The development of low-cost, flexible wireless networks has opened up the potential for additional data collection from the heart of the power plant, increasing real-time understanding of the way that processes and components operate. This also allows for performance validation and improved maintenance practices, including predictive approaches.
Traditional networks rely on a small number of wired access points or wireless hotspots for communication. In a wireless mesh network, the connection is spread among numerous wireless mesh nodes (smart transducers and devices acting as transmitters that function like wireless routers), which share the network connection across a large area. Information travels wirelessly across the network from one mesh node to the next.
The nodes are programmed with software that tells them how to interact within the larger network, and dynamic routing means they automatically choose the quickest and most reliable path. If one node is inoperative, the rest of the nodes can still communicate with each other, directly or through one or more intermediate nodes. Wireless mesh networks can self-form and self-heal.
Only one gateway needs to be physically wired to a network connection, which then wirelessly shares its connection with all other nodes in its vicinity.
Efficiency Solutions Manager Jeff Williams of Emerson Process Management is enthusiastic about the potential of wireless mesh networks to improve data collection from power plants.
“Wireless networks have been available for a number of years,” he says, “but today we see them with expanded functionality. Wireless area networks in power plants are being used in parallel with the plant’s real-time distributed control system. A wireless instrument can collect information that was previously unavailable due to the impracticality and expense of hardwiring in a difficult-to-reach or harsh environment.
The data can then be integrated through the wireless network and sent to the control system so that it can be analyzed and performance adjustments made to the equipment. This leads to the best possible use of plant assets.”
In the past, only a small number of wireless instruments might be used around the plant. Williams says they are increasingly being used for a range of performance testing and diagnostics functions.
“Power generators started using wireless instruments and networks to validate performance of new equipment, but are now finding that the accuracy of wireless instrumentation is so good that it can be used as a backup to wired devices and to verify plant performance. A wireless network can support dozens of instruments used to validate performance.”
The ease with which wireless instrumentation can be deployed compared to hardwired devices means that many new data sets are becoming readily available. As Williams says, “It is easy to hang a wireless transmitter onto a piece of equipment for temporary analysis – getting this information is something which would have previously required the use of many portable instruments as well as associated manpower.”
Wireless networks and sensors are being used to measure increasing amounts of temperature, pressure, vibration and flow data from the heart of the power plant. Intelligent devices can be coupled to intelligent networks and the information can be analyzed remotely, locally or through a combination of both.
Having collected the data, the control system’s software is able to analyze it against historical baseline data for predictive maintenance or other performance evaluation.
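One simple form of that baseline comparison is a statistical drift check: flag any recent reading that deviates more than a few standard deviations from the historical record. The sketch below is a hedged illustration with invented vibration data and an assumed three-sigma threshold; production systems use richer models and multiple correlated signals.

```python
# Hedged sketch of baseline comparison for predictive maintenance.

from statistics import mean, stdev

def drift_alerts(baseline, recent, n_sigma=3.0):
    """Return recent readings deviating more than n_sigma from the baseline."""
    mu, sigma = mean(baseline), stdev(baseline)
    return [x for x in recent if abs(x - mu) > n_sigma * sigma]

# Bearing vibration in mm/s: a stable history, then one excursion.
history = [2.1, 2.0, 2.2, 2.1, 1.9, 2.0, 2.1, 2.0]
print(drift_alerts(history, [2.1, 2.2, 3.4]))  # [3.4]
```

Catching such an excursion early lets maintenance be scheduled before the component fails, which is the essence of moving from reactive to predictive maintenance.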
Williams says: “As we go forward, I think we will see a merging of technologies so that we can watch the plant in real time from the distributed control system and use simulation that is closely tracking that performance to run ‘what-if’ scenarios that will enable us to make process adjustments for more efficient operations. Synchronizing the simulation with real-time plant operation will allow asset managers to optimize plant performance, reduce environmental footprint and keep consumable costs as low as possible.”
Software specialists, plant manufacturers and operators say they are always learning from the data. Going forward, better analytics, high-fidelity physics models and big data look certain to bring operational benefits.
Performance will be diagnosed in real time, sustaining output and efficiency, while asset condition will be monitored and predicted with increasing reliability.
Sophisticated software and smart instruments can improve performance, enabling plant to come online faster from a hot or cold start and despatch more quickly. Software can co-ordinate the steam turbine and the gas turbine to reduce startup time by 50 per cent.
The rate of change looks set to continue and accelerate as developments in hardware, software and communications yield increasing amounts of information about conditions at the heart of the plant. GE’s Hachenski says: “We are learning every day. The amount of information we are getting that we didn’t have before gives us so much more visibility. We have only scratched the surface of figuring out how to take it to the next step.”
Penny Hitchin is a journalist focusing on energy matters.
Power Engineering International Archives