Ted Hoff’s General Purpose Microprocessor

“…even though science and technology are wonderful, what really gets them out there for people to use is to have businesses built around them. It takes savvy businessmen as well as savvy technologists to make that work.”

Ted Hoff

Background

In high school, Ted Hoff had access to then-state-of-the-art vacuum tube circuits. In 1954, he graduated and gained access to then-new transistors and magnetic core memory. Eventually, he earned a bachelor’s degree, then came to Stanford, earning a Ph.D. in 1962.

During that time, he talked to Rex Rice, an on-campus recruiter for Fairchild Semiconductor. Notably, the Traitorous Eight had founded Fairchild, and Doriot student Arthur Rock had funded the business.

Hoff believed the new field of integrated circuits could work well for memory, replacing the clunky and relatively enormous core memory. Eventually, this led to a referral to Bob Noyce, who worked at Fairchild but was starting a new company, Intel. Noyce intended Intel to focus on semiconductor memory and was searching for somebody with Hoff’s background.

Intel

In 1970, while waiting for the technology to mature, Intel decided to build one-off chips for the desktop calculator market. Eventually, Hoff was assigned to help build a chip for the Japanese company Busicom. At first, the Japanese engineers were expected to do all the work, with Hoff acting as a liaison and coordinator.

However, Hoff noticed the Japanese design was suboptimal. There were about a dozen chips, and the entire system appeared needlessly complex. Hoff raised his concerns to Noyce, who encouraged him to make a “backup” design.

Hoff’s design incorporated random access memory and programmability. It was vastly simpler yet overall more powerful by being programmable rather than single-purpose. After a meeting, the customer adopted Hoff and Intel’s design.

Federico Faggin joined Intel and refined Hoff’s idea, optimizing the general-purpose chip to take advantage of Intel technology. By January 1971, the team had a fully functional microprocessor.

The Microprocessor is Born

Their original goal was an embedded system, not a PC chip. Embedded systems are specialty chips that people never see; they make other machines work. The final chip, renamed the Intel 4004, contained between 2,100 and 2,300 transistors, depending upon how one counted. Intel’s 4004 was followed by the 8008 in 1972 and the 8080 in 1974. The 8080 became the foundation of the Altair, the first microcomputer. The Altair inspired a young Bill Gates and Co. to start a software company and a young Steve Jobs and Steve Wozniak to form a computer company.

Cordless Tools

In 1895, C&E Fein, a German company, invented the first electric tool. It was a handheld drill weighing 16.5 pounds. The drill was underpowered because it ran on DC electricity. It also required two people to operate.

In 1910, Duncan Black sold his car for $600 and used the funds to open a machine shop in Baltimore. His friend and business partner, Alonzo Decker, joined the venture.

Their first project involved improving the C&E Fein electric drill. They looked to the Colt pistol’s handle to envision a power drill small enough for one hand, with a pistol grip. The 1916 Black & Decker power drill was vastly lighter, stronger, and required only one person to operate.

At first, Black & Decker only sold their power tools to other businesses. Eventually, they realized the consumer market was also interested in the convenience of power tools and built their business-to-consumer channel. By the early 1920s, the company was advertising power tools in popular newspapers and magazines.

Eventually, other companies created various tools. Over time, power tools became the norm.

In 1961, Black & Decker took the innovation one step further and invented cordless power tools. Like the original C&E Fein drill, the first cordless power tools were heavy and underpowered. However, even with these drawbacks, the benefits were obvious.

In 2005, Milwaukee Electric Tool Company released the first lithium-ion tools. These changed the industry, making cordless tools powerful, long-lasting, and easy to use.

Today, virtually every tool imaginable runs on batteries. Drills, saws, sanders, chainsaws, and even lawnmowers utilize battery-driven electric motors.

Electronic Desktop Calculator

Desktop calculators pioneered the idea of computers small and cheap enough to sit on an individual’s desk. Eventually, they also became the impetus for the general-purpose microchip.

History

The first desktop electronic calculators were the ANITA Mark VII and ANITA Mark VIII, both launched in late 1961. The Bell Punch Co. of Britain designed the ANITA. Markedly, they used vacuum tubes and cold-cathode tubes, with Nixie tubes for the numerical display. Norbert (“Norman”) Kitz led the design and engineering work.

Eventually, the ANITA VII sold in continental Europe and the ANITA VIII in the UK and the rest of the world. However, soon after launch, Bell dropped the ANITA VII and consolidated the product line.

Cost was a major factor in producing the ANITA. To be viable, Bell Punch needed to sell the product for about 1/100th of what the least expensive electronic computers of the day cost. Eventually, the ANITA went on the market for £355 (about £7,800 in 2018, about $10,500 USD). In contrast, the least expensive general-purpose computers in 1961 cost about £50,000 (just over £1 million adjusted to 2018). The device weighed 34 pounds (15.5 kg).
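
Taking the figures above at face value, a rough back-of-the-envelope check shows the gap really was on the order of one hundred to one:

\[
\frac{\pounds 50{,}000}{\pounds 355} \approx 141
\qquad\qquad
\frac{\pounds 1{,}000{,}000}{\pounds 7{,}800} \approx 128 \;\;(\text{2018 prices})
\]

In other words, the ANITA sold for roughly 1/130th to 1/140th of the price of the cheapest computer, comfortably beating the rough 1/100th target.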

Transistor-Based Calculators

Eventually, by 1964, competitors started to release calculators that used transistors rather than tubes. Sharp, Canon, Sony, Toshiba, Wang, and countless others released transistor-based calculators. However, these calculators were similarly priced to the ANITA, or even more expensive. Significantly, they were smaller and lighter due to the lack of tubes.

The Soviet bloc literally weighed in with the Bulgarian-built T-64. Despite its use of semiconductors, the calculator weighed 8 kg (17.6 lbs.); it was also the first calculator able to compute square roots.

Calculators continued to decrease in price and size while increasing in performance.

General-Purpose Microchip

Many calculator companies hired Intel, then a young company, to produce custom chips for their calculators. Eventually, in 1970, Intel engineer Ted Hoff instead created a general-purpose chip for the Japanese company Busicom. Unlike other calculator chips, the Busicom chip was programmable to perform multiple functions, not only those specific to one calculator. In 1971, Intel licensed the chip back and rebranded it the Intel 4004, Intel’s first general-purpose microprocessor.

Mass-Scale Desalination

Reverse osmosis enables large-scale desalination of seawater, efficiently transforming it into drinking water.

People have been trying to desalinate seawater into drinking water for millennia. Aristotle and other ancient Greeks attempted it unsuccessfully. Eventually, by the 16th century, ship-based evaporation systems created potable water. In time, by 1955, the first multi-stage flash distillation (MSF) plant went online. It desalinated water but required distillation, consuming enormous amounts of time and energy.

Finally, in 1959, the first multi-effect distillation (MED) plant came online. Subsequently, researchers at the University of California developed the synthetic reverse osmosis membrane, making industrial-scale reverse osmosis and filtering practical. Together, these developments brought together the building blocks of a modern desalination plant.

Reverse osmosis desalination methods were refined over the following decades. In particular, the filters became vastly more efficient at removing salt and other particulate matter from seawater while using ever less electricity.

Eventually, as aquifers around the world run dry, desalination promises to help offset the use of natural potable water.

At this time, in 2019, Saudi Arabia has the largest desalination plant in the world. It features 8 evaporators and 17 reverse osmosis units and produces 1 million cubic meters of drinking water every day.

Israel comes second; using 16-inch (40.5 cm) membranes, it produces 624,000 cubic meters of drinkable water per day. Thanks to this and other desalination plants, Israel generates more water than the country uses, using the excess to refill drained aquifers and the Sea of Galilee. A plant north of San Diego, in the US, will produce about 190,000 cubic meters of freshwater a day for Californians, who have suffered water rationing for years as aquifers ran dry.

Megawatt Windmill

Megawatt wind turbines are windmills capable of generating a megawatt or more of electricity and feeding it into the electrical grid.

Background

Palmer Putnam was an MIT geologist. The literature notes he had “no formal education or experience in wind power.” Of course, that was true of everybody in the 1930s; there was no such thing as a wind engineer.

Earlier, in the 1880s, Thomas Perry created the Aermotor company, whose windmills generated a small amount of electricity for rural farmers not connected to the grid, enough to run a radio or a small pump. To this day, these types of windmills, with their multiple blades, are familiar from pictures of old farms and from movies.

However, Putnam’s windmill was entirely different. His 75-ft. (23 meters) blades were, by far, the largest ever built. Russians like building big things, and the Soviet Union had previously built the largest windmill generator, a 100-kilowatt machine at Balaklava. Putnam was aiming for ten times that electrical output.

General Electric was supportive of the idea and supplied the generator, limited capital, and connections to people. The Dean of Engineering at MIT, Vannevar Bush, reviewed Putnam’s work and agreed the megawatt windmill was a real possibility. A company that built turbines for dams funded most of the project, envisioning a growth opportunity in renewables beyond dams.

However, as Putnam worked to harness the wind, the winds of war were blowing across the US. There was a feeling it was only a matter of time until the US entered WWII, when engineers and metal would be needed for the war effort.

Like today’s wind turbines, Putnam’s machine used just a few large blades, in his case two. When he finally turned the windmill on, in late 1941, it generated 1.25 megawatts.

WWII

In 1943, the windmill broke, and repairs stalled until after WWII due to a lack of materials and people. After the war, it functioned for three weeks and then broke again, due to a substandard wartime repair. Finally, in 1945, a study showed wind power would cost 50% more than coal-fired electrical plants, and the project was abandoned.

Today, in 2019, wind power costs less than any other form of power including coal. Windmills require no fuel and emit nothing, unlike coal-fired plants with their harmful emissions. The largest windmills operate offshore and generate 9.5 megawatts of power. Every year, windmills increase in size and capacity while lowering price.

Blade Assembly
Palmer Putnam and Stanton Dornbirer of the S. Morgan Smith Co.

DDT

“We are accustomed to look for the gross and immediate effects and to ignore all else. Unless this appears promptly and in such obvious form that it cannot be ignored, we deny the existence of hazard. Even research men suffer from the handicap of inadequate methods of detecting the beginnings of injury. The lack of sufficiently delicate methods to detect injury before symptoms appear is one of the great unsolved problems in medicine.”

Rachel Carson, Silent Spring

Background

DDT is a strong insecticide, especially potent for killing mosquitos. It is also environmentally hazardous, especially to birds. DDT dramatically reduced malaria and typhus during WWII but was also responsible for the near-extinction of several birds, most notably the bald eagle. Furthermore, a book about the harmful effects of DDT was responsible for kickstarting the modern environmental movement.

Dichloro-diphenyl-trichloroethane (DDT) is the first modern insecticide, widely used in the 1940s. The National Pesticide Information Center describes it as “effective, relatively inexpensive to manufacture, and lasts a long time in the environment.” DDT killed pests, especially mosquitos. Rats and mice fed DDT become sterile. During WWII, DDT controlled malaria, typhus, body lice, and bubonic plague.

Scientists at the time believed DDT to be harmless to humans and non-pest wildlife. People sprayed DDT both indoors and out.

DDT spray on beach

With its seemingly magic ability to kill disease-carrying pests, and apparently nothing else, the chemical became ubiquitous. The soil half-life of DDT is 2 to 15 years, and the aquatic half-life is about 150 years.
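
To put those half-lives in perspective, here is a simple exponential-decay estimate (a rough sketch using only the figures above):

\[
\text{fraction remaining after } t \text{ years} = \left(\frac{1}{2}\right)^{t/T}
\]

With the longest soil half-life, T = 15 years, a quarter of a dose is still present after 30 years; with the aquatic half-life, T = 150 years, roughly 87 percent remains after those same 30 years.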

However, it eventually became clear that DDT caused birds to lay thin eggshells that oftentimes cracked. Scientists realized that the widespread use of DDT did have detrimental effects. Among other things, bald eagles nearly went extinct as a result.

Silent Spring

In 1962, author Rachel Carson was already famous for her books about nature. Various people alerted her that there were fewer birds each spring. They tied this to cracked eggshells, tracing that to the widespread aerial spraying of DDT to control mosquitos.

Carson’s book Silent Spring detailed the problems with DDT. “How could intelligent beings seek to control a few unwanted species by a method that contaminated the entire environment and brought the threat of disease and death even to their own kind?” she asked. Carson acknowledged the problems of pest-borne disease and urged that chemicals be used sparingly.

In 1972, the newly created US Environmental Protection Agency banned DDT except in the case of malaria outbreaks. Other countries followed suit. Today, the widespread use of DDT is banned virtually everywhere. However, the chemical is still used on an emergency basis to control malaria. Since the ban, bald eagle populations have largely recovered.

Supertall Skyscraper

Buildings higher than 300 meters (984 ft.) are supertall skyscrapers. During the late 1920s there was an unofficial competition to see who could build the tallest building in New York City.

Background

In 1913, the Woolworth Building was the tallest in New York City, at 792 feet.

New York architects William Van Alen and Craig Severance were business partners, but the partnership dissolved bitterly. Each received a commission to build a tall building and each determined to outdo the other.

In 1929, Severance’s Bank of Manhattan Trust on Wall Street rose to an extraordinary 927 feet (282 meters). However, Van Alen secretly built a 185 ft. (56 meters) tall spire inside his skyscraper, the Chrysler Building. In 1930, it became the tallest building in the world at 1,046 feet (318 meters).

At a total height of over 300 meters, Van Alen’s building arguably became the first supertall skyscraper. However, controversy remains over whether the spire counts towards total height.

The question became one more for the trivia books when, in 1931, the Empire State Building soared to 1,250 feet (381 meters). The tower was the tallest in the world. William F. Lamb was the lead architect.

Later Buildings

The Empire State Building was the tallest building in the world for forty years. Eventually, in 1971, the first of the Twin Towers surpassed the record. The two towers were 1,368 ft. (417 meters) and 1,362 ft. (415 meters), respectively. In 1973, the 1,450 ft. (442 meters) Sears Tower in Chicago topped that height, holding the title of world’s tallest building for decades.

After the Sears Tower, Americans dialed back on height-mania. Several skyscrapers in Asia and the Middle East passed the American towers in height.

Terrorists destroyed the World Trade Center buildings in 2001, murdering nearly 3,000 people. In 2014, One World Trade Center replaced the Twin Towers. It rose to 1,776 ft. (541 meters), making it the tallest building in the US.

Health Insurance

Background

Germany has the oldest health insurance system, the Sickness Insurance Law of 1883. Employers paid one-third and employees two-thirds. The insurance covered both medical treatment and sick leave. In 1911, the UK created basic health insurance. Russia followed in 1912, nationalizing all healthcare after the Russian Revolution of 1917.

Most countries rolled out some form of Universal Healthcare Coverage after WWII. Some countries pay for care directly, including the UK and Canada; there are no co-pays.

France created a hybrid system. The government program pays for basic care. However, individuals are obligated to purchase supplemental policies from private non-profit insurance companies. Most French doctors and clinics are private, whereas most hospitals are government-owned. Many service providers require out-of-pocket payment, which is then reimbursed, via a single-payer system, to the patient’s bank account.

Switzerland’s system is entirely private, but people must purchase healthcare policies.

Regulation of Medical Costs

All countries besides the United States regulate rates for care and medicine. For example, in France’s semi-privatized system, the government sets a reimbursement rate for ordinary physician visits. Doctors may charge more, and patients either pay the difference out-of-pocket or have a supplemental policy which pays all or part of it.

Health Insurance in the United States

The United States took a different approach than the rest of the world, leaving most people covered by entirely private, loosely regulated insurance plans. There are government programs for the elderly and the poor. Government workers, including soldiers, enjoy health coverage, and government retirement plans usually include high-quality health coverage. However, the vast majority of Americans carry private insurance or are uninsured.

US law prevents the US government from negotiating lower drug prices, legalizing price gouging. Drugs in the USA routinely cost many times what the same drug, from the same manufacturer, produced in the same factory, costs in Europe.

Finally, US health insurers are exempt from antitrust law. That is, they may openly and legally collude with one another to drive up prices and profits.

These provisions create an extremely expensive healthcare system in the US. Besides the obvious problems, American healthcare providers spend an enormous amount on overhead negotiating with the myriad of insurers. Additionally, insurers spend enormous sums trying to minimize what they pay out. None of this leads to better care.

In 2016, the United States spent 17.2% of GDP on healthcare, compared to an average of 8.9% across the 36 OECD countries. However, American healthcare outcomes are mediocre at best.

Panama Canal

The 80 km. (50 mi.) long Panama Canal connects the Atlantic and Pacific oceans, avoiding the need to sail around South America.

Background

Unquestionably, the French were stoked after their completion of the Suez Canal. Given that attempts to build that canal stretched back some 3,800 years, their enthusiasm is understandable. Subsequently, they decided to undertake a canal between the Pacific and Atlantic oceans through Panama.

Count Ferdinand de Lesseps broke ground on a sea-level canal in Panama in 1880. He soon realized the sea levels were too far apart and changed his plan to a lock-and-dam system that raises and lowers ships. Nevertheless, constant rain and landslides made the work difficult, and his construction crews kept contracting malaria and yellow fever. Finally, in 1888, he gave up and the French left.

Eventually, in 1902, US President Theodore “Teddy” Roosevelt purchased the French work for $40 million. The US needed a faster sea route between the east and rapidly developing west coast.

At the time, Panama was Colombian territory, and the Colombians refused to allow the Americans rights to the canal. Rather than negotiate, the US arranged for an overthrow of the government and negotiated a lifetime lease with the new puppet regime.

Chief engineer John Wallace started work May 4, 1904, and faced the same problems as the French. Plus, the French equipment had fallen into disrepair by then due to the jungle weather. He quit after a year.

Engineer John Stevens

Railroad engineer John Stevens took over in July 1905 and realized building a canal wasn’t all that different from building a railroad. He eased the disease problem by hiring West Indian and local laborers, who knew how to live in the jungle. Stevens pivoted to railroad equipment rather than proprietary canal-making equipment. He also realized a lock-and-dam system would require a lot of concrete, which would help offset the landslides.

Dr. William Gorgas embraced the then-new idea that mosquitos carried deadly diseases. He focused on fumigation and eliminating pools of stagnant water, vastly decreasing the mosquito population. Eventually, by November 1905, yellow fever cases ceased and malaria cases continuously declined for the next decade.

Engineer George Goethals

In November 1906, construction was on schedule and on budget when Stevens suddenly quit. To this day, nobody knows why. President Roosevelt replaced him with Army Corps engineer George Goethals, granting him near-dictatorial powers. Goethals quashed a work strike.

By 1909, the crews were building locks to guide ships to an enormous man-made lake in the middle of Panama. In October 1913, President Wilson detonated the last dynamite blast via telegraph from the Oval Office, flooding the last dry part of the canal.

The canal officially opened August 15, 1914. At a final price of $350 million, it was the most expensive construction project in US history and arguably in world history.

From start to finish, about 56,000 people worked on the canal. The project had a fatality rate of about 10%, roughly four times the fatality rate for soldiers fighting in WWI. On December 31, 1999, the US gifted the canal to Panama.

https://youtu.be/JYalFQuMkz4

Prefabricated Housing Components

History

Limited amounts of prefabricated components date back to ancient times. Mesopotamians used burnt clay bricks. Romans utilized concrete molds for aqueducts and tunnels, and William the Conqueror conquered the concept. There were movable modular buildings for industry, defense, and even hospitals. However, hand construction was the norm for the vast majority of houses and buildings.

That changed in 1908, when Sears Roebuck released a new item in their catalog: houses. People could order all the parts and pieces required to build an entire high-quality home, and they’d come in a kit. Sears brought standardized parts, the “American Manufacturing Method” (invented by the French), to houses.

“For $1,062 we will furnish all the material to build this Eight-Room House, consisting of Lumber, Lath, Shingles, Mill Work, Siding, Flooring, Ceiling, Finishing Lumber, Building Paper, Pipe, Gutter, Sash Weights, Hardware and Painting Material,” reads a typical ad from 1908. All houses also included free architectural plans to aid in permitting.

Some houses were modest, though many were large, and there was at least one mansion.

As you can see from the catalog page, this was quite a house!

Sears stopped selling kit houses in 1940, after selling about 70,000 of them.

Modern Day

However, the idea of modular building components remains. Today, doors routinely come with frames for installation. Hand-built trusses, which hold up roofs, are virtually unheard of because factory-made ones are safer and cost less. Windows routinely come preassembled and, in some places, hand-built windows are illegal for safety reasons. Countless components of modern houses, especially in the US but also elsewhere, are built at factories, not job sites.

Besides prefabricated house parts, entire prefabricated houses and buildings still exist.

In addition to prefabricated parts, there are also “modular” construction units. These function like building blocks, with various parts of houses and buildings fitting together. Modular buildings theoretically cost less than one-off construction but have higher quality since the pieces are built in tightly controlled factories.

Hotel chain Citizen M uses prefabricated modules to build entire hotels, including a 300-room hotel in New York City. The Chinese famously built the 57-story “J57 Mini Sky City” in 19 days using modules.