"Computers" used to be people

In today's English lexicon, the word "computer" almost exclusively refers to electronic devices — but it used to be a human job. For centuries, "computer" meant "one who computes," particularly in an astronomical observatory or as a surveyor. This definition dates all the way back to the early 1600s, long before even the most primitive digital computing machines existed.

The role of computers was, more often than not, filled by women. Although the work required a great deal of skill and made major contributions to the field of astronomy, computing was considered clerical work. In the 1870s, the Harvard College Observatory hired several dozen women as computers, who compared photographic plates of the night sky and painstakingly measured the differences in stars' positions. Among them were Williamina Fleming, who pioneered classifying stars by temperature; Annie Jump Cannon, who created the letter-based stellar classification system that scientists still use today; and Henrietta Swan Leavitt, who discovered around half of all the variable stars (stars whose brightness appears to change when viewed from Earth) known at the time.

Perhaps the best-known human computers were those employed by NASA to make calculations by hand during critical space missions. Katherine Johnson, one of the three African American NASA computers featured in the book and film Hidden Figures (along with Dorothy Vaughan and Mary Jackson), performed calculations for the Mercury and Apollo missions, including the first moon landing.

By the Numbers

Variable stars discovered by Henrietta Swan Leavitt: 2,400

Types of stars in the classification system created by Annie Jump Cannon: 7

Letters in today's expanded stellar classification system: 10

Panels on the basement-sized ENIAC computer, built in the 1940s: 40

Did you know?

The first mechanical computer was designed in the 19th century.

In 1833, British inventor Charles Babbage had a groundbreaking idea. Having just worked on a giant calculating machine, he realized that machines could be more than basic calculators; they could also be programmed with algorithms to carry out far more complex calculations automatically. Babbage planned for his Analytical Engine, as the programmable computer was known, to run on steam, receive data via punched cards, and output results through an automatic printer. Mathematician Ada Lovelace wrote the first computer program while studying the design, although she never got to try it out, since the machine was never completed. The Analytical Engine was left unfinished upon Babbage's death in 1871. The concept was wildly ambitious, and technology didn't catch up until the 1940s, with the completion of the first electronic programmable computer, the ENIAC.

