Here we will introduce the key concepts of the AC metering unit as a system/product.
The purpose of this material is to give the reader enough grounding in each topic to go out and find more detailed information themselves.
Voltage sensing is the process of measuring the voltage between two points in an electrical system.
In single-phase AC metering, this is typically live to neutral and is known as phase voltage.
In multi-phase systems, it is common to measure line-to-line voltages, referred to as line voltages.
Generally speaking, the goal of the voltage sensing circuit is to scale the line voltage into a range the ADC can safely sample, while preserving the shape and phase of the waveform.
By far the most common method for voltage sensing is a resistive (potential) divider, which scales the voltage down to a safe range for input to the ADC.
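As a quick numerical sketch of the divider approach: the component values below are illustrative only (a hypothetical 2 MΩ / 10 kΩ divider feeding a 3.3 V ADC), not a recommended design.

```python
# Sketch: scaling 230 V RMS mains into a 3.3 V ADC range with a resistive divider.
# Component values are illustrative, not a recommended design.
import math

def divider_output_peak(v_rms, r_top, r_bottom):
    """Peak voltage at the divider tap for a sinusoidal input."""
    ratio = r_bottom / (r_top + r_bottom)
    return v_rms * math.sqrt(2) * ratio

# Example: 2 MOhm top, 10 kOhm bottom gives roughly a 1/201 division.
v_peak = divider_output_peak(230.0, 2_000_000, 10_000)
print(f"peak at ADC pin: {v_peak:.3f} V")
# ~1.62 V peak: swinging around a 1.65 V mid-rail bias keeps the full
# waveform inside a 0-3.3 V ADC input range.
```

In practice the bottom leg is also biased to mid-rail and low-pass filtered before the ADC pin; the ratio calculation is the essential part.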
Line frequency refers to the fundamental AC system frequency, typically 50 Hz or 60 Hz, depending on regional standards.
Accurate line frequency measurement is important for:
Frequency is a quality indicator of grid health because national supply operators give strict frequency guarantees - for example, the UK National Grid guarantees at most +/-1% deviation from 50 Hz, and normal operating conditions are usually far tighter than that.
This makes intuitive sense when thinking of the supply side: when a large generator is overloaded its rotation slows or even stalls, and its rotational speed is directly proportional to the frequency it puts onto the grid.
Accurate frequency measurement is therefore a common requirement, notably in EV chargers.
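One common way to measure line frequency in firmware is to time zero crossings of the voltage waveform. A minimal sketch, using a synthetic 50 Hz signal and an assumed 4 kHz sample rate (a real meter would filter the input first):

```python
# Sketch: estimating line frequency from rising zero crossings of a sampled waveform.
import math

def estimate_frequency(samples, sample_rate):
    """Count rising zero crossings and divide cycles by the elapsed time."""
    crossings = [i for i in range(1, len(samples))
                 if samples[i - 1] < 0.0 <= samples[i]]
    if len(crossings) < 2:
        return None  # not enough crossings to measure
    cycles = len(crossings) - 1
    duration = (crossings[-1] - crossings[0]) / sample_rate
    return cycles / duration

fs = 4000.0  # assumed sampling rate in Hz
# 0.2 s of a 50 Hz sine (small phase offset avoids samples landing exactly on zero):
wave = [math.sin(2 * math.pi * 50.0 * n / fs + 0.1) for n in range(800)]
print(f"estimated frequency: {estimate_frequency(wave, fs):.2f} Hz")
```

Real implementations interpolate between the two samples straddling each crossing for sub-sample resolution, which matters at modest sample rates.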
Current sensing is the process of converting line current into a measurable voltage signal for the ADC subsystem.
The sensing element must provide a linear, phase-accurate, and low-noise representation of the true current waveform.
In this section we will look at common current sensing techniques in AC metering.
A shunt resistor is a precision low-ohmic resistor placed in series with the current path.
The voltage drop across the resistor is proportional to current by Ohm’s law:
\[ V_\mathrm{shunt} = I_\mathrm{line} \cdot R_\mathrm{shunt} \]
Advantages | Disadvantages |
---|---|
- High accuracy and linearity<br>- Excellent phase response<br>- Low cost and simple to implement<br>- Tamper resistant | - No galvanic isolation<br>- Power dissipation increases with current<br>- Must be placed carefully to avoid thermal drift and EMI coupling |
Shunts are most common in low-cost single-phase or direct-connected meters.
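The central shunt design tension, signal level versus self-heating, can be shown numerically. Values below are invented for a hypothetical 100 A-class single-phase meter:

```python
# Sketch: the shunt-sizing trade-off between signal level and self-heating.
# Values are illustrative for a hypothetical 100 A-class single-phase meter.

def shunt_tradeoff(r_shunt, i_rms):
    """Return (signal RMS voltage, dissipated power) for a given shunt value."""
    v_signal = i_rms * r_shunt           # Ohm's law: V = I * R
    p_dissipated = i_rms ** 2 * r_shunt  # Joule heating: P = I^2 * R
    return v_signal, p_dissipated

for r in (100e-6, 250e-6, 500e-6):  # metering shunts are often in the hundreds of micro-ohms
    v, p = shunt_tradeoff(r, 100.0)
    print(f"{r * 1e6:>5.0f} uOhm -> {v * 1000:.1f} mV RMS, {p:.1f} W at 100 A")
```

A larger shunt gives more signal (easier for the ADC front end) but dissipates more heat at full load, which drives thermal drift; this is why metering AFEs provide high-gain, low-noise current channels for tiny shunt voltages.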
A current transformer (CT) provides galvanic isolation between the line and measurement circuit.
It works on the principle of magnetic induction, where the line current through the primary winding induces a proportional current in the secondary winding:
\[ I_\mathrm{secondary} = \frac{I_\mathrm{primary}}{N} \]
Where \( N \) is the turns ratio.
This secondary current is then converted to a voltage through a burden resistor and filter network to be fed into an ADC.
Advantages | Disadvantages |
---|---|
- Provides isolation and safety<br>- Low insertion loss<br>- Suitable for high-current applications | - Limited bandwidth and possible phase shift at low current<br>- Requires burden resistor for voltage conversion<br>- Potential for core saturation issues<br>- Susceptible to tamper |
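The secondary-current to burden-voltage conversion is simple arithmetic; the 2000:1 turns ratio and 10 Ω burden below are assumed example values:

```python
# Sketch: converting a CT secondary current to an ADC voltage via a burden resistor.
# The 2000:1 ratio and 10 Ohm burden are assumed example values.

def ct_burden_voltage(i_primary, turns_ratio, r_burden):
    """Secondary current (I_primary / N) times burden resistance gives the sensed voltage."""
    i_secondary = i_primary / turns_ratio
    return i_secondary * r_burden

# 80 A line current through a 2000:1 CT with a 10 Ohm burden:
v = ct_burden_voltage(80.0, 2000, 10.0)
print(f"burden voltage: {v * 1000:.0f} mV RMS")  # 80 A / 2000 = 40 mA -> 400 mV
```

The burden value is a design trade-off: too large a burden raises the voltage the core must support and pushes it toward saturation, too small a burden sacrifices signal level.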
A Rogowski coil is an air-cored current sensor that measures the rate of change of current:
\[ V_\mathrm{rogowski} = -N \frac{d\phi}{dt} \]
Where \( \phi \) is the magnetic flux through the coil; since the coil is air-cored, this can be rewritten as:
\[ V_\mathrm{rogowski} = - \frac{\mu_0 N A}{2 \pi r} \frac{dI}{dt} \]
Where \( \mu_0 \) is the permeability of free space (the coil is air-cored), \( N \) is the number of turns, \( A \) is the cross-sectional area of a single turn, and \( r \) is the major radius of the coil.
Advantages | Disadvantages |
---|---|
- Excellent linearity and very wide bandwidth<br>- No magnetic core (no saturation)<br>- Provides galvanic isolation | - Output requires integration to reconstruct current waveform<br>- Sensitive to positioning, meaning extra mechanical/assembly considerations |
Rogowski coils are often used in industrial meters or power quality analyzers where high accuracy and wide dynamic range are needed.
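Because the coil outputs a voltage proportional to \( dI/dt \), the meter must integrate to recover \( I(t) \). A minimal numerical sketch, with an assumed coil sensitivity and sample rate (real designs use an analog integrator or a DSP integrator with DC-drift removal):

```python
# Sketch: recovering the current waveform from a Rogowski coil's di/dt output
# by discrete (rectangular) integration. Sensitivity and sample rate are assumed.
import math

def integrate_rogowski(v_samples, sensitivity, sample_rate):
    """Cumulative integration: i[n] = i[n-1] + v[n] / (sensitivity * fs)."""
    current = []
    acc = 0.0
    for v in v_samples:
        acc += v / (sensitivity * sample_rate)
        current.append(acc)
    return current

fs = 10_000.0  # assumed 10 kHz sampling
k = 1e-3       # assumed coil output in volts per (ampere/second)
# Synthesize the coil output for a 10 A peak, 50 Hz current: v = k * dI/dt.
coil = [k * 10.0 * 2 * math.pi * 50.0 * math.cos(2 * math.pi * 50.0 * n / fs)
        for n in range(200)]
i_rec = integrate_rogowski(coil, k, fs)
print(f"reconstructed peak ~ {max(i_rec):.2f} A")
```

The rectangular rule leaves a small amplitude error here; production integrators also need a high-pass element, because any DC offset at the input integrates into an unbounded ramp.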
Power and energy measurement combine the sensed voltage and current to compute instantaneous and averaged quantities.
Power computation involves multiplying simultaneous voltage and current samples to form instantaneous power, \( p(t) = v(t) \cdot i(t) \), and averaging it over whole line cycles to obtain active power.
Energy is obtained by integrating power over time:
\[ E = \int p(t) \, dt \]
Digital meters perform this discretely by accumulating the product of each voltage and current sample pair over the sampling interval: \( E \approx \sum_n v[n] \, i[n] \, \Delta t \).
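A minimal sketch of this discrete accumulation, driven by synthetic sine waves at unity power factor (the 4 kHz sample rate and amplitudes are invented test values):

```python
# Sketch: discrete energy accumulation from sampled voltage and current.
import math

def accumulate_energy(v_samples, i_samples, sample_rate):
    """E ~ sum of v[n]*i[n]*dt: the discrete form of E = integral of p(t) dt."""
    dt = 1.0 / sample_rate
    return sum(v * i for v, i in zip(v_samples, i_samples)) * dt

fs = 4000.0
n = int(fs)  # one second of samples
v = [325.0 * math.sin(2 * math.pi * 50.0 * k / fs) for k in range(n)]   # ~230 V RMS
i = [14.14 * math.sin(2 * math.pi * 50.0 * k / fs) for k in range(n)]   # ~10 A RMS, unity PF
e_joules = accumulate_energy(v, i, fs)
print(f"energy over 1 s: {e_joules:.1f} J ({e_joules / 3600:.4f} Wh)")
```

With in-phase 230 V RMS and 10 A RMS waveforms the accumulated energy comes out at roughly 2.3 kJ per second, i.e. about 2.3 kW of active power, matching \( P = V_\mathrm{rms} I_\mathrm{rms} \) at unity power factor.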
Many meters include a pulse output (e.g. LED or open collector) proportional to energy consumed.
This pulse output rate is a defining feature of a meter and is generally stated on the meter's label to enable readouts.
This feature is called the Meter Constant and is generally provided in imp/kWh.
If a meter has a constant of 1000 imp/kWh, then every 1000 pulses correspond to 1 kWh of registered energy - or in other words, 1 impulse equals 1 Wh (or 3600 Ws).
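The conversion between pulse counts and energy is a one-liner; the 1000 imp/kWh value matches the example above:

```python
# Sketch: converting a pulse count into energy using the meter constant.

def pulses_to_kwh(pulse_count, meter_constant_imp_per_kwh):
    """Each pulse represents 1/constant kWh of accumulated energy."""
    return pulse_count / meter_constant_imp_per_kwh

# 2500 pulses at 1000 imp/kWh -> 2.5 kWh, i.e. each pulse is 1 Wh.
print(pulses_to_kwh(2500, 1000))  # -> 2.5
```

Test rigs run this in reverse: they count pulses over a known applied load to verify the meter's registered energy against the reference.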
Pulse outputs are typically used for accuracy verification during calibration and testing, and for simple external readout by data loggers.
The topology defines how voltage and current sensing channels are connected to measure one or more phases of a system.
Used in most domestic installations.
Measures one phase and neutral pair; both voltage and current are sensed from the same line.
The phase is generally tapped off a larger utility three-phase supply.
Common in North American systems.
Generally created when a utility supply phase (240V) is split using a centre-tapped transformer.
This provides two 120V "legs", A and B, while the centre tap is connected to earth ground.
Smaller loads are connected between either leg and neutral, while larger domestic loads (ovens, washers, etc.) are connected across both legs at 240V.
Used in industrial and commercial systems.
Involves measurement of three phase voltages and currents (generally with a neutral reference).
Supports both three-wire (delta) and four-wire (wye) connections, along with per-phase and aggregate measurements.
Meters are typically classified based on application and current rating.
Designed for residential use, typically up to 100 A per phase.
Single-phase or split-phase configurations.
Often include communication (e.g. Modbus, M-Bus, or wireless).
Used in three-phase installations with higher currents and voltages.
Support advanced features such as multi-tariff billing, maximum demand recording, power quality monitoring, and harmonic analysis.
Meters are often marked with a rating such as 230V 5(80)A 50Hz Class 1:
This means the meter achieves Class 1 accuracy up to a maximum current of 80A, with a basic (reference) current of 5A, calibrated at 230V and 50Hz.
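As a small illustration, the rating string can be unpacked programmatically; the regex below assumes exactly the label layout shown above:

```python
# Sketch: pulling the reference and maximum currents out of a rating string
# like "230V 5(80)A 50Hz Class 1". The exact label layout is assumed here.
import re

def parse_rating(label):
    """Extract voltage, basic current I_b, maximum current I_max, and frequency."""
    m = re.search(r"(\d+)V\s+(\d+)\((\d+)\)A\s+(\d+)Hz", label)
    if not m:
        return None
    volts, i_b, i_max, hz = (int(g) for g in m.groups())
    return {"voltage": volts, "i_basic": i_b, "i_max": i_max, "frequency": hz}

print(parse_rating("230V 5(80)A 50Hz Class 1"))
# -> {'voltage': 230, 'i_basic': 5, 'i_max': 80, 'frequency': 50}
```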
Logging refers to the continuous or periodic storage of measured quantities (e.g. energy, voltage, current, frequency, temperature).
For billable meters, logging is almost always a legal requirement.
Typical logging requirements include interval energy consumption, maximum demand, and event records such as outages and tamper attempts.
Logging intervals can range from seconds to hours, depending on application requirements and available memory.
Calibration is the process of adjusting the measurement system so that its readings accurately correspond to known reference standards.
In AC metering, calibration ensures that measured quantities such as voltage, current, power, and energy align with their true physical values across the meter’s operating range.
Calibration directly affects billing accuracy, compliance with the meter's declared accuracy class, and ultimately its legal approval for sale.
Calibration can be performed during manufacturing or field servicing.
It involves injecting known reference signals and adjusting the digital or analog correction coefficients so that the meter reports within tolerance.
Typical calibration stages include offset removal, voltage and current channel gain correction, and phase (delay) compensation between channels.
Meters are typically calibrated either individually against a reference standard, or in batches on an automated rig.
In high production volume environments, computer-controlled sources and reference meters (0.02% accuracy or better) generate precision load/operating conditions and trigger a calibration across a batch of meters, which store the resulting correction coefficients in non-volatile memory.
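The stored correction is often just a multiplicative gain per channel. A simplified sketch of deriving one such coefficient from a single calibration point (the readings are invented):

```python
# Sketch: deriving a per-channel gain correction from one calibration point.
# The rig applies a known reference load; the meter stores ref/measured in NVM.

def gain_coefficient(reference_value, measured_value):
    """Multiplicative correction so that measured * coefficient = reference."""
    return reference_value / measured_value

# Example: reference meter reports 230.00 V, unit under test reads 228.41 V.
k_v = gain_coefficient(230.00, 228.41)
corrected = 228.41 * k_v
print(f"gain coefficient: {k_v:.5f}, corrected reading: {corrected:.2f} V")
```

Real calibration fits gain plus offset (and phase) over several load points rather than a single one, but the principle of storing a correction that maps raw readings onto the reference is the same.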
Tampering refers to deliberate or accidental actions that alter a meter’s intended measurement behaviour.
Detection and logging of tampering events are crucial for billing accuracy, safety, and regulatory compliance.
Common types of tampering include magnetic interference, opening the meter cover, bypassing or reversing the current path, and disconnecting the neutral.
Modern electronic meters incorporate multiple mechanisms to detect and record tampering, such as magnetic field sensors, cover-open switches, reverse-current detection, and comparison of live and neutral currents.
When a tamper condition is detected, the event is timestamped and logged, a status flag is raised, and - where a communication link exists - an alarm may be reported upstream.
All tamper detections should be recorded in non-volatile memory, forming part of the legal metrology record.
Logged data typically includes the event type, a timestamp, a snapshot of the measured quantities at the time, and a running event counter.
These logs can be retrieved using protocols such as DLMS/COSEM, Modbus, or proprietary field tools for analysis and audit.
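As a rough illustration of what such a record might hold, here is a hypothetical event structure; real layouts and field names are defined by the metrology firmware and the retrieval protocol:

```python
# Sketch: a minimal tamper event record of the kind kept in non-volatile memory.
# Field names are illustrative; real layouts are firmware/protocol defined.
from dataclasses import dataclass
import time

@dataclass
class TamperEvent:
    event_type: str   # e.g. "cover_open", "magnetic_field", "reverse_current"
    timestamp: float  # Unix time of detection
    voltage_v: float  # snapshot of measured quantities at the event
    current_a: float
    counter: int      # running count of this event type

log = []

def record_tamper(event_type, voltage_v, current_a):
    """Append a timestamped event with a per-type running counter."""
    count = sum(1 for e in log if e.event_type == event_type) + 1
    log.append(TamperEvent(event_type, time.time(), voltage_v, current_a, count))

record_tamper("cover_open", 229.8, 0.42)
record_tamper("cover_open", 230.1, 0.40)
print(len(log), log[-1].counter)  # -> 2 2
```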
Compliance in AC metering defines the set of standards, directives, and certification requirements that an energy meter must meet to be legally sold or installed in a given market.
It ensures the product meets technical performance, safety, electromagnetic compatibility, and accuracy criteria defined by national and international authorities.
At the highest level, global conformance is defined by the International Electrotechnical Commission (IEC) — a non-profit organization responsible for standardizing electrical and electronic systems worldwide.
IEC standards define what constitutes a compliant metering device from a technical standpoint and form the foundation upon which regional and national standards are based.
Common IEC standard families relevant to AC metering include:
Category | IEC Series | Description |
---|---|---|
EMC | IEC 61000 | Electromagnetic compatibility (EMC) standards defining immunity and emission limits for electronic equipment. |
General Metering | IEC 62052 | Defines general requirements for electricity metering devices (AC/DC systems). |
Specific Metering | IEC 62053 | Defines accuracy and performance classes for static and electromechanical meters. |
Manufacturing / Inspection | IEC 62058 | Defines methods for acceptance inspection and verification of newly manufactured meters. |
Regions or trade blocs (e.g. EU, UK) translate international standards into legal frameworks that products must comply with to be marketed.
These frameworks define requirements — which are then supported by standards that provide the technical means to prove compliance.
Typical compliance hierarchy in the EU/UK:
Step | Entity | Purpose |
---|---|---|
1 | Requirements (2014/32/EU – MID / UK MIR 2016) | Defines legal and functional requirements for measuring instruments. |
2 | Standards (EN 50470-1, EN 50470-3) | Technical standards detailing design, accuracy, EMC, and environmental tests. |
3 | Testing (Accredited Test House) | Perform tests against standards to verify compliance. |
4 | Certification | Test house issues certification if requirements are met. |
5 | Declaration of Conformity (DoC) | Manufacturer issues DoC declaring the product compliant. |
The MID (2014/32/EU) and the UK MIR 2016 define the legal framework for electricity meters, specifying what must be tested, documented, and certified.
Testing is typically performed per BS EN 50470-1 and BS EN 50470-3.
While these are general EMC directives, specific test methods come from IEC 61000 series standards.
Meters are tested for immunity (e.g. electrostatic discharge, fast transients, surges, and radiated RF) as well as conducted and radiated emissions.
These tests confirm that the meter operates reliably in realistic electrical environments.
Accuracy defines how closely a meter’s measurement corresponds to the true electrical quantity.
In the EU and UK, accuracy classes are defined under MID/MIR and EN 50470-3, while globally they trace back to IEC 62053.
Region | Standard | Accuracy Classes |
---|---|---|
IEC | IEC 62053-21 / -22 / -23 | Class 2, Class 1, Class 0.5, Class 0.2S, etc. |
EU / UK | BS EN 50470-3 | Class A, Class B, Class C |
Mapping between IEC and EN accuracy classes:
EN 50470-3 (EU/UK) | IEC 62053 Equivalent | Relative Accuracy |
---|---|---|
Class A | Class 2 | Least accurate |
Class B | Class 1 | Typical residential accuracy |
Class C | Class 0.5 | High precision / industrial use |
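A compliance check at a single test point reduces to comparing the percentage error against the class limit. The headline limits below are simplified single numbers; the actual standards specify limits that vary with load current and power factor:

```python
# Sketch: checking a test-point error against a (simplified) accuracy class limit.
# Real limits vary with load current and power factor per IEC 62053 / EN 50470-3.

CLASS_LIMITS_PCT = {"A": 2.0, "B": 1.0, "C": 0.5}  # assumed headline limits

def within_class(measured_kwh, reference_kwh, accuracy_class):
    """True if the relative error at this test point is inside the class limit."""
    error_pct = (measured_kwh - reference_kwh) / reference_kwh * 100.0
    return abs(error_pct) <= CLASS_LIMITS_PCT[accuracy_class]

print(within_class(10.08, 10.00, "B"))  # 0.8 % error -> passes Class B
print(within_class(10.08, 10.00, "C"))  # 0.8 % error -> fails Class C
```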
To achieve compliance, a typical AC energy meter must be verified against multiple domains of standards:
Category | Examples / References |
---|---|
Accuracy / Metering | EN 50470-1 / EN 50470-3, IEC 62053-21, IEC 62053-22 |
EMC | IEC 61000-6-1 / 6-2 (immunity), 61000-6-3 / 6-4 (emission) |
Mechanical, Safety & Environmental | IEC 62052-11 / 31 (general & safety) |
Manufacturing / Inspection | IEC 62058-11 / 31 (factory acceptance & inspection) |