
The remote control, like the A-bomb and personal computer, began as a top-secret government experiment.

It made its debut in the American home shortly after World War II. Its first practical application was as an automatic garage-door opener. In 1950, Zenith unveiled the aptly named “Lazy Bones,” which was tethered to the TV by an unsightly cable. The first wireless remote was basically a flashlight aimed at the set. Introduced in 1955, it was called the “Flash-matic” and worked by shining light at photocells built into the television.

Ultrasonic remotes, or “clickers” (so named for the sound they made), were introduced in the mid-’50s and replaced in the late ’70s by infrared remotes, which remain the standard.

“Remotes were developed for the couch potato as a lazy way of controlling a few channels,” explains Justin Henry, director of engineering for Logitech. “But as time went on, we became increasingly dependent on them. Now, they’re a key element of consumer electronics: You can’t effectively control a device without one.”

Once a pricey extra that could boost the cost of a TV by 30 percent, the remote has grown progressively smaller and less expensive with the introduction of the transistor and the computer chip.

In the 1990s, single-device replacement remotes morphed into multi-device universal remotes, some of which now use radio frequencies to control devices in other rooms or inside cabinets.
