
Data Compression Technology Speeds Power Quality Analysis

Posted 1 Apr 2010, 08:04 by Power Quality Doctor

An engineer’s main objective when troubleshooting a power quality event is to identify the source of the disturbance in order to determine the required corrective action. To identify the source, the engineer depends on recorded data captured by monitoring equipment.

Management demands that a cost-effective solution to the problem be implemented as quickly as possible. The electrical engineer speaks of installing instrumentation, collecting data, analyzing data, re-installing, and re-analyzing. It is not uncommon for months to pass before the problem is isolated and a solution is implemented.

Power quality analysis has traditionally posed a unique challenge to the engineer, demanding an accurate guess at the dimensions of the disturbance in order to capture the event to memory for examination. The correct balance between memory size and the deviation of the disturbance from the norm is often elusive. Thresholds set too low capture too many events of little or no consequence, filling the memory before the sought-after damaging event occurs; thresholds set too high can miss the event entirely.
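
To make the tradeoff concrete, here is a minimal Python sketch (the numbers, names, and buffer sizes are invented for illustration, not any vendor's firmware) of a threshold-triggered recorder with a finite event memory:

```python
import random

def capture_events(disturbances, threshold_pct, memory_slots):
    """Record every disturbance whose deviation exceeds the trigger
    threshold, until the finite event memory fills up."""
    memory, missed_after_full = [], 0
    for deviation in disturbances:
        if deviation < threshold_pct:
            continue                      # below trigger: never recorded
        if len(memory) < memory_slots:
            memory.append(deviation)      # captured to event memory
        else:
            missed_after_full += 1        # memory already full: event lost
    return memory, missed_after_full

random.seed(1)
# Hundreds of minor deviations (0-8%) followed by one rare, damaging 45% sag.
disturbances = [random.uniform(0, 8) for _ in range(500)] + [45.0]

_, lost_low = capture_events(disturbances, threshold_pct=2.0, memory_slots=50)
captured_high, _ = capture_events(disturbances, threshold_pct=50.0, memory_slots=50)
print(f"low threshold:  memory filled with trivia, {lost_low} later events lost")
print(f"high threshold: {len(captured_high)} events captured at all")
```

With the low threshold, the buffer fills with nuisance events before the damaging sag arrives; with the high threshold, nothing is captured at all.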

Data Compression Technology

Revolutionary data compression technology takes the guesswork out of isolating the source of power quality problems by eliminating the need for devising set points and calculating threshold values.

Capturing the waveform data in its entirety, at high resolution and over an extended period of time, is the only way to ensure that the event is recorded, allowing the engineer to analyze the data and define a solution.

Until now, monitoring and analyzing electrical system trends has presented a true challenge because data compromises were required to counteract capacity, processing, and physical limitations. Data compression technology provides effectively unlimited capacity for power quality data storage. This eliminates the need to constrain system data, rendering obsolete the risky selection of data by preset thresholds and triggers.
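
As a back-of-envelope illustration of why continuous capture was previously impractical, the following sketch estimates the raw data volume involved (the channel count and sampling figures are assumptions for illustration; only the roughly 1000:1 ratio comes from the technology described later in this article):

```python
# Assumed sampling figures, for illustration only.
channels = 8             # e.g., 4 voltage + 4 current channels
samples_per_cycle = 1024
line_freq_hz = 50
bytes_per_sample = 2     # 16-bit ADC

raw_bytes_per_sec = channels * samples_per_cycle * line_freq_hz * bytes_per_sample
raw_gb_per_day = raw_bytes_per_sec * 86_400 / 1e9
compressed_gb_per_year = raw_gb_per_day * 365 / 1000   # at a ~1000:1 ratio

print(f"raw stream: {raw_bytes_per_sec / 1e6:.1f} MB/s, {raw_gb_per_day:.0f} GB/day")
print(f"at ~1000:1 compression: about {compressed_gb_per_year:.0f} GB/year")
```

Uncompressed, a single monitoring point would generate tens of gigabytes per day; compressed, years of gap-free data fit on modest storage.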

Operators of electrical networks are constantly faced with power events and transient occurrences that degrade power quality and drive up energy costs.

In the past, to determine whether such events reflect system trends or isolated incidents, electrical engineers relied on partial information indicating what events occurred and when; not all events were recorded due to data capacity limitations and missed thresholds. Now, engineers analyzing multi-point, time-synchronized real-time power quality data can actually see why all power events occur and what causes them. In short, data compression technology pushes power quality analysis capabilities into the next generation.

Data compression technology allows for both immediate power quality problem solving and true proactive energy management. The ability to analyze all data at any time enables energy managers to call up and analyze historic, time-based energy consumption trends in order to make supply-side decisions. Data compression technology thus allows control over both the consumption and the quality of the supplied energy.
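
As a minimal sketch of the kind of historic, time-based trend query this enables (the interval readings and their values below are invented for illustration):

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical 15-minute kWh interval readings for one week.
start = datetime(2006, 6, 1)
readings = [(start + timedelta(minutes=15 * i), 40 + (i % 96) * 0.5)
            for i in range(96 * 7)]

# Aggregate interval energy into daily totals for supply-side review.
daily_kwh = defaultdict(float)
for ts, kwh in readings:
    daily_kwh[ts.date()] += kwh

for day, total in sorted(daily_kwh.items()):
    print(day, f"{total:,.0f} kWh")
```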

Optimal system functionality across diverse network topologies depends on the ability of energy suppliers, service providers, and industrial and commercial energy consumers to deliver power quality over time and to successfully analyze, predict, and prevent energy events using multi-point, historic, true-time logged data.

Achieving Benefits

The U.S. Department of Energy estimates that about $80 billion a year is lost to power quality issues. To reduce these losses, operators must identify the sources of power events and prevent their recurrence.

Problem sources are many and often point to the need for predictive and preventive maintenance measures. Utility operators face problem sources such as capacity constraints, weather conditions, and equipment failures. Consumers suffer from equipment failures, faulty installations, and incompatible equipment usage that creates destructive resonance conditions.

With effective monitoring installed, power providers can avoid the negative impacts of diminished quality and service capabilities and prevent damages arising from the following factors:

In industrial sectors:
  • downtime
  • product quality
  • maintenance costs
  • hidden costs (reputation, recall)

In commercial and service sectors:
  • service stoppage
  • service quality
  • maintenance costs
  • hidden costs (reputation, low customer satisfaction level)


Once a power quality event is fully characterized by accessing compressed power quality data, a solution can be implemented successfully.


Analysis Resources and Capabilities

Implementing data compression technology in an electrical installation means:

  • Needed information is stored; no more data compromises are required to work around recording-resolution and capacity limitations
  • Years of data for every network cycle are available with no data gaps
  • Thresholds and triggers are no longer needed; missing events becomes a thing of the past
  • All data parameters are recorded; there is no need to select measurement parameters
  • Comprehensive power quality reporting and statistics for data analysis and report generation are accessible and organized
  • Multi-point time-synchronized recording provides a true snapshot for any period in the entire network

Over the years, various technologies have evolved for monitoring and logging power quality data. Certainly, throughout this period, developers addressed the same challenges regarding power quality, data capacity, and system trends. Ultimately, the analysis of sampled data serves to manage, maintain, and optimize system operations and costs.

Four Technological Generations

It is possible to delineate four distinct generations in the development of power quality technology:

  • 1st generation, power meter/monitor: First-generation technologies, whether analog or digital, provide display capabilities only; the displayed information is used for monitoring the system.
  • 2nd generation, data logger: Second-generation technologies use periodic logging mechanisms and present data in paper or paperless form. Still, the information is used for system monitoring only.
  • 3rd generation, event recorder/power quality analyzer: Third-generation technologies require the setting of thresholds and triggers, which are always difficult to assess correctly given that memory capacity is finite and quickly filled. When values are set too low, the capacity fills instantly; when values are set too high, very few events are recorded.
  • 4th generation, power quality data center: Fourth-generation technology provides limitless, continuous logging and storage of power quality data using data compression technology. Setting parameter values, thresholds, triggers, and other constraints on the data is no longer required.

Additionally, a troubleshooter can determine why power quality events occur anywhere in the electrical network and identify their causes, regardless of the cycle in which they occur. This measurement and analysis technology enables the engineer to optimize electrical network efficiency and cut power quality losses by relying on the analysis of gap-free data.

Data Analysis Advantages

Data compression can help optimize analysis activities by introducing multi-point time synchronization to the process (see figure 1 on right). Troubleshooters can trace energy flows over the network during power events to determine their causes. To analyze an event correctly, it is equally important to log network energy flows when no events are occurring, and to keep logging at all other points while an event occurs at a specific point.
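
A minimal sketch of a multi-point, time-synchronized lookup (the site names, cycle indices, and measurement values are invented): because every point logs continuously against the same clock, the same network cycle can be examined at all sites at once.

```python
# Each monitoring point logs continuously; records are keyed by the same
# synchronized cycle index, so one instant can be examined network-wide.
site_logs = {
    "substation": {1000: {"V_pu": 0.98, "I_A": 210}, 1001: {"V_pu": 0.71, "I_A": 540}},
    "feeder_A":   {1000: {"V_pu": 0.97, "I_A": 120}, 1001: {"V_pu": 0.70, "I_A": 430}},
    "feeder_B":   {1000: {"V_pu": 0.99, "I_A":  80}, 1001: {"V_pu": 0.95, "I_A":  85}},
}

def snapshot(cycle):
    """Return every site's record for the same network cycle."""
    return {site: log[cycle] for site, log in site_logs.items()}

# At cycle 1001 the sag appears at the substation and feeder_A but barely
# at feeder_B, pointing the troubleshooter toward feeder_A's load.
for site, record in snapshot(1001).items():
    print(site, record)
```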

During power quality events, impedances change. Using fourth-generation technology, it is possible to calculate impedances and perform accurate network simulations for comprehensive analysis.
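
For instance, the change in source impedance seen at a bus can be estimated from the phasor change between a pre-event cycle and an in-event cycle. The sketch below uses the standard ΔV/ΔI sag-analysis approximation with invented phasor values; it is a textbook method, not necessarily the internal algorithm of any particular instrument.

```python
# Estimate the source impedance from the phasor change across an event:
# Z ≈ -ΔV / ΔI (standard sag-analysis approximation; values invented).
V_pre, I_pre = complex(230.0, 0.0), complex(10.0, -3.0)     # volts, amperes
V_sag, I_sag = complex(170.0, -12.0), complex(85.0, -40.0)  # during the sag

dV, dI = V_sag - V_pre, I_sag - I_pre
Z = -dV / dI
print(f"estimated source impedance: {Z.real:.2f} + j{Z.imag:.2f} ohm")
```

Repeating this calculation at different times, and at different sites, is what allows the impedance comparisons described in the example below.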


Examples

Question: What was the source of the voltage sag?

Answer: The source could be any one, or a combination, of the illustrated events; other factors could also be involved. Monitoring multiple sites simultaneously and continuously allows the engineer to see the whole picture, all the time. Power quality events can be examined at the time of the event and in the context of the timeline before and after it, comparing the impedances at the site at different times.

Question: What causes data bottlenecks when logging data with non-compression technologies (see figure 2 on right)?

Answer: Data bottlenecks with non-compression technologies are caused by limitations in recording speed capabilities, storage space, communication throughput, and computer processing capacity. 


Question: How are data bottlenecks eliminated by using data compression technology?

Answer: Data bottlenecks in the logging process are eliminated using data compression technology (see figure 3 below): the CPU compresses the data in real time; a CompactFlash card is used instead of a hard disk; the data are compressed, so storage capacity is no longer a limiting factor; and block-oriented processing is implemented.
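
The sketch below illustrates block-oriented processing with a generic compressor (Python's zlib, not the actual PQZip codec): samples are accumulated into fixed-size blocks, and each block is compressed as a unit before being written to flash.

```python
import math, struct, zlib

def sample_block(n_start, n_samples, samples_per_cycle=256):
    """Synthesize 16-bit waveform samples (an idealized 50 Hz sine)."""
    return [int(20000 * math.sin(2 * math.pi * (n_start + k) / samples_per_cycle))
            for k in range(n_samples)]

BLOCK = 4096  # samples processed and compressed as one unit
stored = []
for b in range(8):
    raw = struct.pack(f"<{BLOCK}h", *sample_block(b * BLOCK, BLOCK))
    stored.append(zlib.compress(raw, 9))   # each block compressed independently

raw_size = 8 * BLOCK * 2                   # bytes before compression
comp_size = sum(len(c) for c in stored)
print(f"raw {raw_size} B -> stored {comp_size} B "
      f"({raw_size / comp_size:.0f}:1 on this idealized periodic signal)")
```

Because each block is self-contained, compression keeps pace with sampling and no stage of the pipeline has to buffer the entire stream.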

Implementing Data Compression

Patent-pending PQZip data compression technology, employed by the Elspec G4400 Power Quality Data Center, provides:
  • A compression algorithm with a typical 1000:1 compression ratio; this real-time compression, performed independently of the sampling, prevents data gaps
  • Multi-point implementation of time-synchronized devices over the entire grid, showing the interplay of the values recorded at different points in the network at the same instant
  • Infinite, continuous logging and storage of data for total network analysis
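
PQZip itself is proprietary, but the reason periodic power waveforms compress so dramatically can be illustrated generically: each cycle is nearly a sum of a few harmonics, so keeping only the significant spectral coefficients preserves the waveform with a fraction of the data. The sketch below shows that generic spectral idea, not Elspec's actual method:

```python
import cmath, math

def compress_cycle(samples, keep=5):
    """Keep only the 'keep' largest harmonics of one cycle (generic
    spectral reduction for illustration; NOT the proprietary PQZip scheme)."""
    n = len(samples)
    spectrum = [sum(samples[t] * cmath.exp(-2j * math.pi * h * t / n)
                    for t in range(n)) / n for h in range(n)]
    ranked = sorted(range(n), key=lambda h: -abs(spectrum[h]))[:keep]
    return [(h, spectrum[h]) for h in ranked]   # a handful of coefficients

def reconstruct(coeffs, n):
    """Rebuild the cycle from the retained coefficients."""
    return [sum((c * cmath.exp(2j * math.pi * h * t / n)).real for h, c in coeffs)
            for t in range(n)]

n = 128  # samples per cycle
# One cycle: 230 V fundamental plus a 12 V third harmonic.
cycle = [230 * math.sin(2 * math.pi * t / n) + 12 * math.sin(2 * math.pi * 3 * t / n)
         for t in range(n)]
coeffs = compress_cycle(cycle)
err = max(abs(a - b) for a, b in zip(cycle, reconstruct(coeffs, n)))
print(f"{n} samples -> {len(coeffs)} coefficients, max reconstruction error {err:.1e} V")
```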


Summarizing Benefits

Of the four generations of technological evolution for storing power quality data for analysis, only fourth-generation data compression technology affords the unprecedented advantage of infinite, continuous logging and storage of high-resolution data. This new technology avoids capacity issues, so the resulting data are entirely uncompromised, a clear advantage when analyzing system power trends and events. The natural and desired outcome of in-depth system analysis is the prediction and prevention of power events, reduced power costs, and the constant supply of enhanced power quality.


This article was printed in the December 2006 issue of Energy & Power Management magazine (pp. 18-20).
