When and Why Did the U.S. Transition from 110V to 120V Supply?

Why Did the Voltage Level Increase from 110V to 120V in North America?

Electric power transmission and distribution is not as straightforward in the U.S. as it is in IEC-following countries. Most of these countries, including the UK, Australia, and many European and Asian nations (except Japan), use 230V for single-phase and 400/415V for three-phase applications. In contrast, the U.S. employs multiple voltage levels for both commercial and residential applications, such as 120V (previously 110V), 208V, 240V, 277V, and 480V. Canada, whose grid is closely interconnected with that of the U.S., also offers additional voltage levels such as 347V and 600V.

First of all, let’s clarify that electrical panels in residential premises do not supply only 110V; they also provide a 240V single-phase supply, similar to the 230V supply used in IEC-compliant countries. The difference is that 120V is available between a phase (hot) wire and neutral for small-load applications. On the other hand, 240V is available between two hot wires (Phase 1 and Phase 2) for heavy-load appliances such as dryers, stoves, and electric heaters.
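The split-phase arrangement described above can be sketched numerically. This is an illustrative example, not from the article: the two hot legs come from a center-tapped transformer secondary and are 180° out of phase relative to neutral, which is why each leg measures 120V to neutral but 240V leg to leg.

```python
import math

# Illustrative sketch of U.S. split-phase residential service.
# Each hot leg is 120 V RMS to neutral; the legs are 180 degrees
# out of phase, so leg-to-leg voltage is 240 V RMS.

V_RMS_LEG = 120.0                    # each hot leg to neutral, RMS
V_PEAK = V_RMS_LEG * math.sqrt(2)    # ~169.7 V peak
FREQ_HZ = 60.0

def leg1(t):
    """Instantaneous voltage of hot leg 1 relative to neutral."""
    return V_PEAK * math.sin(2 * math.pi * FREQ_HZ * t)

def leg2(t):
    """Hot leg 2 is the inverse of leg 1 (180 degrees out of phase)."""
    return -leg1(t)

# Voltage between the two hot legs is leg1 - leg2 = 2 * leg1,
# so its RMS value is 2 * 120 V = 240 V.
t = 1 / 240  # quarter cycle, where leg1 is at its positive peak
print(round(leg1(t) - leg2(t), 1))   # ~339.4 V peak, i.e. 240 V RMS
```

Small 120V loads (lights, outlets) connect between one hot leg and neutral; heavy loads such as dryers connect across both hot legs.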

About a century ago, the standard voltage in the U.S. was 110V, but over time it gradually increased to 120V (±5%). This means you may observe slightly lower or higher voltage at an outlet or main panel due to the permissible voltage drop variation.

Originally, 110V was common in the 1920s. This increased to 115V during the 1930s, then to 117V in the 1950s, which was briefly adopted as the standard. However, it was soon replaced by 120V in the 1960s, which remains the standard today.

During the War of Currents between AC and DC, it is possible that Thomas Edison initially chose 110V so that approximately 100V could be delivered at the socket outlet, compensating for the significant voltage drop in wiring conductors. At the time, electrical insulation and safety measures weren’t as advanced as they are today. Lower voltage meant a reduced risk of electrocution and other electrical hazards. At 100V, incandescent bulbs operated effectively without burning out too quickly. This may also be one reason why Japan adopted 100V as its standard voltage.

In the 1930s, as the reliability and lifespan of light bulbs improved and electric motors became more common, power suppliers increased voltage levels from 110V/220V to 115V/230V. Eventually, when equipment and infrastructure were upgraded, the National Electrical Code (NEC) standardized the voltage in North America to 120V and 240V.

Some older devices and plugs from various manufacturers are still rated for and can operate at 110V or 115V, as they are compatible with the lower end of the voltage range. Conversely, you may encounter ratings such as 125V/250V on certain appliances and outlets (such as NEMA 120V/240V receptacles), which indicate the maximum operating voltage they can safely handle. The transition was smooth because many appliances and devices were already designed to operate on 110–120V, making the 120V standard convenient for both consumers and manufacturers.

Despite the change, some people continue to refer to the system as “110/220,” a habit that lingers from earlier standards. A similar case can be seen in regions using 230V today, where many still refer to it as 220V, even though the change from 220V to 230V was officially implemented in 1989.

U.S. Transition from 110V to 120V Supply

Good to Know: The transition to alternating current (AC) power, with its advantages over direct current (DC), played a significant role. The US initially used DC at 110-120V, but the shift to AC eventually led to a standard of 120V for AC, making it compatible with the existing DC infrastructure.

When and Why Did They Change from 110V to 120V?

The U.S. residential power supply didn’t transition from 110V to 120V in a single, sudden shift, but rather through a gradual evolution driven by practical considerations and standardization efforts.

The United States transitioned from a nominal 110V to a 120V standard for residential and commercial use primarily for reasons of safety, convenience, and practicality, beginning in the 1920s. The shift also reflected the evolving landscape of electricity distribution and the increasing popularity of AC power.

When the Transition Occurred:

Thomas Edison’s early (late 19th/early 20th century) DC distribution systems operated at around 110V. AC power, championed by George Westinghouse and Nikola Tesla, eventually prevailed because transformers made it easy to step voltage up and down. AC distribution was then standardized at around 110V to remain compatible with existing incandescent light bulbs.

Over the years, the nominal voltage gradually increased from 110V up to 117V. You can find appliances from the 1940s and 50s labeled 110V, 115V, or 117V.

Final standardization occurred in the 1960s and 1970s. The current standard of 120/240V at 60 Hz was largely solidified around 1967 and further reinforced by National Electrical Code (NEC) changes in 1968 and 1984. By the early 1970s, the electrical industry had formally raised the voltage to 120/240 volts from 110/220 volts.

Good to Know: The US also implemented 240V for high-power appliances like stoves, dryers, and air conditioners, often using split 120V circuits powered by a 240V main panel.

Why the Transition Occurred:

The gradual increase from 110V to 120V and its standardization were due to the following reasons:

Early incandescent light bulbs, which were a major load in the early days of electricity, performed optimally in terms of brightness and filament life within a certain voltage range. As filament technology improved, they could reliably handle slightly higher voltages, leading to the shift.

As more and more appliances were introduced into homes, and electricity consumption increased, a slightly higher voltage allowed for more power to be delivered without needing significantly larger (and more expensive) wire sizes. Higher voltages allow for smaller wire sizes for the same power (watts) since:

P = V × I.
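The P = V × I relationship can be checked with a quick sketch. The 1500W load below is a hypothetical example (a typical space heater rating), not a figure from the article: for the same power, the higher voltage draws less current, which is what permits smaller conductors.

```python
# Hypothetical numbers illustrating P = V * I: for a fixed load power,
# a higher supply voltage draws less current.

def current(power_w, voltage_v):
    """Current drawn by a load of the given power at the given voltage."""
    return power_w / voltage_v

load_w = 1500.0  # e.g. a 1500 W heater (illustrative value)
print(round(current(load_w, 110.0), 2))  # 13.64 A at 110 V
print(round(current(load_w, 120.0), 2))  # 12.5 A at 120 V
```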

Another reason for the increase was improved voltage regulation on power distribution and transmission transformers. Similarly, as more motor circuits were added in residential and commercial applications, the National Electrical Code (NEC) eventually updated motor ratings to reflect the 120V standard, mirroring the industry’s shift.

In terms of reduced line losses, a slightly higher voltage (120V instead of 110V) for the same power (watts) means lower current (amps). Since heat losses in conductors are proportional to the square of the current (I²R), a lower current reduces energy loss in the wiring. While the jump from 110V to 120V is small, it contributes to overall system efficiency.

Finally, it was the National Electrical Manufacturers Association (NEMA) that played a key role in establishing 120V as the standard in the 1920s. This standardization made it possible for manufacturers to produce compatible appliances that would work reliably across the country, and for electricians to wire homes consistently.
